Oct 02 12:58:53 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 12:58:54 crc restorecon[4679]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 12:58:54 crc restorecon[4679]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc 
restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc 
restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 
12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 
12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc 
restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc 
restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc 
restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:54 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 
crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc 
restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc 
restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 12:58:55 crc restorecon[4679]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 12:58:56 crc kubenswrapper[4724]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.037951 4724 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045465 4724 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045507 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045511 4724 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045516 4724 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045519 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045524 4724 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045528 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045549 4724 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045554 4724 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045557 4724 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045561 4724 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045565 4724 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045568 4724 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045572 4724 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045575 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045579 4724 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045582 4724 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045586 4724 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045589 4724 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045593 4724 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045596 4724 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045600 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045603 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045609 4724 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045622 4724 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045627 4724 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045631 4724 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045637 4724 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045641 4724 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045645 4724 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045649 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045653 4724 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045656 4724 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045660 4724 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045664 4724 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045667 4724 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045671 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045675 4724 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiNetworks Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045679 4724 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045682 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045686 4724 feature_gate.go:330] unrecognized feature gate: Example Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045689 4724 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045693 4724 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045696 4724 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045707 4724 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045711 4724 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045714 4724 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045719 4724 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045723 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045727 4724 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045731 4724 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045736 4724 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045741 4724 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045745 4724 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045749 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045754 4724 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045758 4724 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045761 4724 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045765 4724 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045770 4724 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045774 4724 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045778 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045782 4724 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045787 4724 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045790 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045794 4724 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045798 4724 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045801 4724 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045805 4724 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045809 4724 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.045812 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045921 4724 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045934 4724 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045944 4724 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045953 4724 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045960 4724 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045966 4724 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045973 4724 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045981 4724 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045987 4724 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.045993 4724 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 12:58:56 crc 
kubenswrapper[4724]: I1002 12:58:56.045998 4724 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046004 4724 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046009 4724 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046014 4724 flags.go:64] FLAG: --cgroup-root="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046018 4724 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046023 4724 flags.go:64] FLAG: --client-ca-file="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046028 4724 flags.go:64] FLAG: --cloud-config="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046033 4724 flags.go:64] FLAG: --cloud-provider="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046038 4724 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046045 4724 flags.go:64] FLAG: --cluster-domain="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046050 4724 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046055 4724 flags.go:64] FLAG: --config-dir="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046060 4724 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046066 4724 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046073 4724 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046080 4724 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046085 4724 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 
12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046090 4724 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046095 4724 flags.go:64] FLAG: --contention-profiling="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046099 4724 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046104 4724 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046108 4724 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046112 4724 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046119 4724 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046124 4724 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046130 4724 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046135 4724 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046140 4724 flags.go:64] FLAG: --enable-server="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046146 4724 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046154 4724 flags.go:64] FLAG: --event-burst="100" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046159 4724 flags.go:64] FLAG: --event-qps="50" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046165 4724 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046170 4724 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046176 4724 flags.go:64] FLAG: --eviction-hard="" Oct 02 12:58:56 
crc kubenswrapper[4724]: I1002 12:58:56.046183 4724 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046188 4724 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046194 4724 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046200 4724 flags.go:64] FLAG: --eviction-soft="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046206 4724 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046211 4724 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046216 4724 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046221 4724 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046226 4724 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046231 4724 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046236 4724 flags.go:64] FLAG: --feature-gates="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046242 4724 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046246 4724 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046250 4724 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046256 4724 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046261 4724 flags.go:64] FLAG: --healthz-port="10248" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046266 4724 flags.go:64] FLAG: --help="false" Oct 02 12:58:56 
crc kubenswrapper[4724]: I1002 12:58:56.046272 4724 flags.go:64] FLAG: --hostname-override="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046278 4724 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046283 4724 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046289 4724 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046294 4724 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046299 4724 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046304 4724 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046308 4724 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046313 4724 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046319 4724 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046324 4724 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046329 4724 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046334 4724 flags.go:64] FLAG: --kube-reserved="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046339 4724 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046345 4724 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046351 4724 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046355 4724 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046361 4724 flags.go:64] FLAG: --lock-file="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046367 4724 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046373 4724 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046378 4724 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046387 4724 flags.go:64] FLAG: --log-json-split-stream="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046392 4724 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046398 4724 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046403 4724 flags.go:64] FLAG: --logging-format="text" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046408 4724 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046414 4724 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046419 4724 flags.go:64] FLAG: --manifest-url="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046424 4724 flags.go:64] FLAG: --manifest-url-header="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046431 4724 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046437 4724 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046443 4724 flags.go:64] FLAG: --max-pods="110" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046448 4724 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046453 4724 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046458 4724 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046464 4724 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046470 4724 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046476 4724 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046482 4724 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046497 4724 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046503 4724 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046508 4724 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046513 4724 flags.go:64] FLAG: --pod-cidr="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046518 4724 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046527 4724 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046548 4724 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046554 4724 flags.go:64] FLAG: --pods-per-core="0" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046559 4724 flags.go:64] FLAG: --port="10250" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046563 4724 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 12:58:56 crc 
kubenswrapper[4724]: I1002 12:58:56.046568 4724 flags.go:64] FLAG: --provider-id="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046573 4724 flags.go:64] FLAG: --qos-reserved="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046579 4724 flags.go:64] FLAG: --read-only-port="10255" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046584 4724 flags.go:64] FLAG: --register-node="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046588 4724 flags.go:64] FLAG: --register-schedulable="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046593 4724 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046603 4724 flags.go:64] FLAG: --registry-burst="10" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046638 4724 flags.go:64] FLAG: --registry-qps="5" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046643 4724 flags.go:64] FLAG: --reserved-cpus="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046648 4724 flags.go:64] FLAG: --reserved-memory="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046655 4724 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046660 4724 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046665 4724 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046670 4724 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046674 4724 flags.go:64] FLAG: --runonce="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046679 4724 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046684 4724 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 
12:58:56.046689 4724 flags.go:64] FLAG: --seccomp-default="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046694 4724 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046699 4724 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046704 4724 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046710 4724 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046715 4724 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046721 4724 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046727 4724 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046732 4724 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046736 4724 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046741 4724 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046746 4724 flags.go:64] FLAG: --system-cgroups="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046757 4724 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046766 4724 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046771 4724 flags.go:64] FLAG: --tls-cert-file="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046777 4724 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046783 4724 flags.go:64] FLAG: --tls-min-version="" Oct 02 12:58:56 crc 
kubenswrapper[4724]: I1002 12:58:56.046790 4724 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046795 4724 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046800 4724 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046806 4724 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046811 4724 flags.go:64] FLAG: --v="2" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046819 4724 flags.go:64] FLAG: --version="false" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046826 4724 flags.go:64] FLAG: --vmodule="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046833 4724 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.046839 4724 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.046979 4724 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.046987 4724 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.046993 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.046998 4724 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047003 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047007 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047012 4724 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047016 4724 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047021 4724 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047025 4724 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047029 4724 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047034 4724 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047038 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047042 4724 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047046 4724 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047051 4724 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047056 4724 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047060 4724 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047069 4724 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047073 4724 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047077 4724 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047082 4724 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047088 4724 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047095 4724 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047100 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047104 4724 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047109 4724 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047113 4724 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047118 4724 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047123 4724 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047127 4724 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047131 4724 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047136 4724 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047140 4724 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047144 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047148 4724 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047153 4724 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047157 4724 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047162 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047166 4724 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047170 4724 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047174 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047179 4724 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047183 4724 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047188 4724 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 12:58:56 crc 
kubenswrapper[4724]: W1002 12:58:56.047192 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047197 4724 feature_gate.go:330] unrecognized feature gate: Example Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047201 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047206 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047210 4724 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047217 4724 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047223 4724 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047230 4724 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047234 4724 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047239 4724 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047244 4724 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047248 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047253 4724 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047257 4724 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047262 4724 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047266 4724 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047271 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047275 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047279 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047283 4724 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047289 4724 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047295 4724 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047300 4724 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047305 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047310 4724 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.047315 4724 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.047329 4724 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.064906 4724 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.064949 4724 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065013 4724 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065022 4724 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065026 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065031 4724 feature_gate.go:330] 
unrecognized feature gate: VSphereDriverConfiguration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065034 4724 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065038 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065044 4724 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065051 4724 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065055 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065059 4724 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065063 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065066 4724 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065070 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065073 4724 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065077 4724 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065080 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065089 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065093 4724 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065096 4724 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065101 4724 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065105 4724 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065108 4724 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065112 4724 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065115 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065119 4724 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065122 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065126 4724 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065129 4724 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065133 4724 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065137 4724 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065140 4724 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065145 4724 feature_gate.go:330] unrecognized feature gate: 
OnClusterBuild Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065150 4724 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065154 4724 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065158 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065163 4724 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065167 4724 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065171 4724 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065176 4724 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065181 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065190 4724 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065197 4724 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065204 4724 feature_gate.go:330] unrecognized feature gate: Example Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065209 4724 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065214 4724 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065218 4724 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 
12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065221 4724 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065225 4724 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065228 4724 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065232 4724 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065235 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065239 4724 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065242 4724 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065246 4724 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065249 4724 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065254 4724 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065258 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065262 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065265 4724 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065269 4724 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065273 
4724 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065278 4724 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065283 4724 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065287 4724 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065290 4724 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065294 4724 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065299 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065303 4724 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065306 4724 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065310 4724 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065313 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.065320 4724 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065434 4724 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065442 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065446 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065452 4724 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065458 4724 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065463 4724 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065468 4724 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065473 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065478 4724 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065482 4724 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065486 4724 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065491 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065495 4724 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065499 4724 feature_gate.go:330] unrecognized feature gate: GatewayAPI 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065503 4724 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065508 4724 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065511 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065515 4724 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065519 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065523 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065526 4724 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065550 4724 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065562 4724 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065567 4724 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065571 4724 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065575 4724 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065580 4724 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065584 4724 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065588 4724 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065592 4724 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065596 4724 feature_gate.go:330] unrecognized feature gate: Example Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065599 4724 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065602 4724 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065606 4724 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065610 4724 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065615 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065618 4724 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065624 4724 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065628 4724 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065631 4724 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065635 4724 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065639 4724 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065643 4724 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065646 4724 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065650 4724 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065654 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065659 4724 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065669 4724 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065676 4724 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065682 4724 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065686 4724 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065691 4724 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065697 4724 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065701 4724 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065706 4724 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065710 4724 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065715 4724 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065720 4724 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065725 4724 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065729 4724 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065733 4724 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065737 4724 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065740 4724 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065744 4724 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065749 4724 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065753 4724 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065757 4724 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065761 4724 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065765 4724 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065769 4724 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.065773 4724 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.065779 4724 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.065982 4724 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.076601 4724 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.076714 4724 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.078926 4724 server.go:997] "Starting client certificate rotation"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.078970 4724 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.079175 4724 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-25 19:44:13.87469959 +0000 UTC
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.079244 4724 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2022h45m17.795458839s for next certificate rotation
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.123168 4724 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.126081 4724 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.148216 4724 log.go:25] "Validated CRI v1 runtime API"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.180526 4724 log.go:25] "Validated CRI v1 image API"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.182519 4724 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.188736 4724 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-12-54-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.188824 4724 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm
major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.216233 4724 manager.go:217] Machine: {Timestamp:2025-10-02 12:58:56.211387419 +0000 UTC m=+0.666146560 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5e560baf-345b-4d65-984c-1cfbf6a74dd2 BootID:1650a7ad-7086-4fb1-9f9b-b9368db16424 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1f:73:30 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1f:73:30 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c8:86:d0 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d9:11:ea Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2f:d0:53 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6b:29:f6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:02:bf:f4:a3:65:d9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:de:c8:83:45:12:bb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.217015 4724 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.217276 4724 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.218006 4724 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.218272 4724 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.218329 4724 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.218594 4724 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.218604 4724 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.219447 4724 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.219478 4724 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.220886 4724 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.221021 4724 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.226579 4724 kubelet.go:418] "Attempting to sync node with API server"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.226628 4724 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.226657 4724 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.226705 4724 kubelet.go:324] "Adding apiserver pod source"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.226723 4724 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.232020 4724 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.232862 4724 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.236580 4724 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.237594 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.237637 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.237684 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.237685 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:56
crc kubenswrapper[4724]: I1002 12:58:56.238395 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238435 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238449 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238464 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238485 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238499 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238514 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238564 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238583 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238597 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238617 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.238632 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.240575 4724 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.241008 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.241342 4724 server.go:1280] "Started kubelet" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.241553 4724 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.241526 4724 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.242180 4724 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 12:58:56 crc systemd[1]: Started Kubernetes Kubelet. Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.244051 4724 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.244099 4724 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.244276 4724 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.244443 4724 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.250521 4724 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.244131 4724 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:16:18.770133367 +0000 UTC Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.251266 4724 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2292h17m22.518890894s for next certificate rotation Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 
12:58:56.244473 4724 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.251668 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="200ms" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.252610 4724 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.252640 4724 factory.go:55] Registering systemd factory Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.252652 4724 factory.go:221] Registration of the systemd container factory successfully Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.253403 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.253572 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.254808 4724 factory.go:153] Registering CRI-O factory Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.254921 4724 factory.go:221] Registration of the crio container factory successfully Oct 02 12:58:56 crc 
kubenswrapper[4724]: I1002 12:58:56.254962 4724 factory.go:103] Registering Raw factory Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.254980 4724 manager.go:1196] Started watching for new ooms in manager Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.259952 4724 server.go:460] "Adding debug handlers to kubelet server" Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.254625 4724 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aae03e7f461fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 12:58:56.241287677 +0000 UTC m=+0.696046848,LastTimestamp:2025-10-02 12:58:56.241287677 +0000 UTC m=+0.696046848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.264641 4724 manager.go:319] Starting recovery of all containers Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.270568 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.270793 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: 
I1002 12:58:56.270889 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.270979 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271055 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271133 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271191 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271264 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271332 4724 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271393 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271455 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271514 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271616 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271680 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271740 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271798 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271862 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271920 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.271975 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272029 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272114 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272177 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272277 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272341 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272404 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272459 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272552 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272670 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272747 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.272823 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276217 4724 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276259 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276278 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276294 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276309 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276323 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276337 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276352 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276366 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276381 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276402 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276416 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276431 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276446 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276460 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" 
seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276478 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276493 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276509 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276528 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276561 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276578 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276594 4724 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276611 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276646 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276663 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276678 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276700 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276717 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276734 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276748 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276764 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276782 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276796 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276810 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276827 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276842 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276856 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276870 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276886 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276900 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276916 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276930 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276950 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276965 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276981 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.276995 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277010 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277024 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277038 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277053 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277071 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277087 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277105 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277122 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277137 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277151 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277170 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277186 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277204 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277220 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277234 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277247 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277262 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277279 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277294 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277309 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277323 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277339 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277353 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277369 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 
12:58:56.277385 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277398 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277415 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277430 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277446 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277486 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277503 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277520 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277585 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277609 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277626 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277641 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277659 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277674 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277690 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277705 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277719 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277732 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277747 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277764 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277787 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277802 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277816 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277830 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277844 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" 
seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277858 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277872 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277888 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277902 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277918 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277932 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 
12:58:56.277947 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277962 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.277976 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278014 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278031 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278050 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278065 4724 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278080 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278093 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278107 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278125 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278140 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278153 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278169 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278184 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278200 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278217 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278230 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278246 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278262 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278276 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278293 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278309 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278323 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278338 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278351 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278366 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278381 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278394 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278410 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278423 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278437 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278450 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278465 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278479 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278493 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278509 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 02 12:58:56 crc kubenswrapper[4724]: 
I1002 12:58:56.278523 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278556 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278570 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278586 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278598 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278612 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278625 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278638 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278650 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278662 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278678 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278725 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278740 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278756 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278769 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278784 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278801 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278814 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278829 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278847 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278862 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278876 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278890 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278905 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278919 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447"
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278934 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278947 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278963 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278976 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.278991 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279005 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279020 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279035 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279048 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279061 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279076 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279090 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279104 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279117 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279130 4724 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279142 4724 reconstruct.go:97] "Volume reconstruction finished"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.279152 4724 reconciler.go:26] "Reconciler: start to sync state"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.283514 4724 manager.go:324] Recovery completed
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.299782 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.301807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.302235 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.302247 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.304072 4724 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.304105 4724 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.304132 4724 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.309872 4724 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.312266 4724 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.312325 4724 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.312363 4724 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.312422 4724 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.313419 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.313513 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.322630 4724 policy_none.go:49] "None policy: Start"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.323712 4724
memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.323736 4724 state_mem.go:35] "Initializing new in-memory state store"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.344365 4724 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.375021 4724 manager.go:334] "Starting Device Plugin manager"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.375089 4724 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.375105 4724 server.go:79] "Starting device plugin registration server"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.375612 4724 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.375634 4724 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.376139 4724 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.376300 4724 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.376314 4724 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.382249 4724 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.413484 4724 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.413627 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415155 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415226 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415256 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415474 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415744 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.415781 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416583 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416628 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416633 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416666 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416681 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416640 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416823 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.416973 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417014 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417683 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417712 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417724 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417833 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417967 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.417995 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418010 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418283 4724 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418318 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418692 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418712 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418721 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.418861 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419040 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419095 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419664 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419683 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419702 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419725 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419864 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.419894 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420241 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420278 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420286 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420454 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420484 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.420496 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.452652 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="400ms"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.476475 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.477982 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.478030 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.478047 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.478080 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.478669 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.481550 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.481594 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.481691 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.481724 4724 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.481749 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482040 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482064 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482084 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482131 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482149 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482169 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482273 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482332 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482375 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.482401 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584248 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584339 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584357 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584375 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584400 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584417 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584432 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584448 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584472 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584489 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584504 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584520 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584571 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584592 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584608 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584614 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584684 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584727 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584691 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584754 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584618 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584759 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584997 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.585002 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.585006 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584989 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.585051 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584975 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.585070 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.584640 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.679316 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.680748 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.680781 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:56 crc 
kubenswrapper[4724]: I1002 12:58:56.680814 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.680843 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.681590 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.750336 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.755977 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.762028 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.803467 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: I1002 12:58:56.809950 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.844998 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1f0ddebfbb0fe76a6880f05053822d9a061802631f3ef639caad0418539ec9b4 WatchSource:0}: Error finding container 1f0ddebfbb0fe76a6880f05053822d9a061802631f3ef639caad0418539ec9b4: Status 404 returned error can't find the container with id 1f0ddebfbb0fe76a6880f05053822d9a061802631f3ef639caad0418539ec9b4 Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.848212 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-18fe23a49b7c52bc4741336c48d84be24801aef168fdd3cd834829e607d319a8 WatchSource:0}: Error finding container 18fe23a49b7c52bc4741336c48d84be24801aef168fdd3cd834829e607d319a8: Status 404 returned error can't find the container with id 18fe23a49b7c52bc4741336c48d84be24801aef168fdd3cd834829e607d319a8 Oct 02 12:58:56 crc kubenswrapper[4724]: E1002 12:58:56.854091 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="800ms" Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.865285 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4cafb53d6370d53b62ee7088105d7ca50cac4288e0682edd94676d1778e4b1c3 WatchSource:0}: Error finding container 4cafb53d6370d53b62ee7088105d7ca50cac4288e0682edd94676d1778e4b1c3: Status 404 returned error can't find the container with id 
4cafb53d6370d53b62ee7088105d7ca50cac4288e0682edd94676d1778e4b1c3 Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.871885 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-46957868cf27cab1b75b034bc7f9a63be1dbb7966b46f7d91a725c8f77dce20e WatchSource:0}: Error finding container 46957868cf27cab1b75b034bc7f9a63be1dbb7966b46f7d91a725c8f77dce20e: Status 404 returned error can't find the container with id 46957868cf27cab1b75b034bc7f9a63be1dbb7966b46f7d91a725c8f77dce20e Oct 02 12:58:56 crc kubenswrapper[4724]: W1002 12:58:56.877437 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a71e04c954615db74af33f3f9e2ac0856dddd97fae95fe572d69a35f974befc4 WatchSource:0}: Error finding container a71e04c954615db74af33f3f9e2ac0856dddd97fae95fe572d69a35f974befc4: Status 404 returned error can't find the container with id a71e04c954615db74af33f3f9e2ac0856dddd97fae95fe572d69a35f974befc4 Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.082396 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.084037 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.084089 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.084101 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.084133 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 
12:58:57.084864 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 02 12:58:57 crc kubenswrapper[4724]: W1002 12:58:57.217009 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.217127 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 02 12:58:57 crc kubenswrapper[4724]: W1002 12:58:57.236795 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.236933 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.242129 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection 
refused Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.318839 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1f0ddebfbb0fe76a6880f05053822d9a061802631f3ef639caad0418539ec9b4"} Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.320479 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a71e04c954615db74af33f3f9e2ac0856dddd97fae95fe572d69a35f974befc4"} Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.321841 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46957868cf27cab1b75b034bc7f9a63be1dbb7966b46f7d91a725c8f77dce20e"} Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.323603 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4cafb53d6370d53b62ee7088105d7ca50cac4288e0682edd94676d1778e4b1c3"} Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.325310 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18fe23a49b7c52bc4741336c48d84be24801aef168fdd3cd834829e607d319a8"} Oct 02 12:58:57 crc kubenswrapper[4724]: W1002 12:58:57.653122 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.653233 
4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.654439 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="1.6s" Oct 02 12:58:57 crc kubenswrapper[4724]: W1002 12:58:57.805404 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.806008 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.885082 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.886785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.886828 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.886839 4724 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:57 crc kubenswrapper[4724]: I1002 12:58:57.886864 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 12:58:57 crc kubenswrapper[4724]: E1002 12:58:57.887433 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.242347 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.330628 4724 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="87f8a67a501746cac3972c9889b37545d776c7d31aac20eab9bce7e9cb5a28b9" exitCode=0 Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.330720 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"87f8a67a501746cac3972c9889b37545d776c7d31aac20eab9bce7e9cb5a28b9"} Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.330777 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.331694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.331733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.331744 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.332843 4724 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4" exitCode=0 Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.332887 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4"} Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.332991 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.334465 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.334489 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.334499 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.335260 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6"} Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.335289 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b"} Oct 02 12:58:58 crc kubenswrapper[4724]: 
I1002 12:58:58.336614 4724 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b" exitCode=0 Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.336688 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.336688 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b"} Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.338579 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.338613 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.338626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.339575 4724 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614" exitCode=0 Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.339621 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614"} Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.339730 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 
12:58:58.340733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.340764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.340774 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.341313 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.342456 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.342481 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:58 crc kubenswrapper[4724]: I1002 12:58:58.342491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.242440 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.255918 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="3.2s" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.346868 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78"} Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.346929 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322"} Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.347056 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.349447 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.349495 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.349506 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.351952 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751"} Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.352035 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae"} Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.352051 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.352065 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.355293 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.355404 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.355424 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.355330 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.359636 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.359683 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.359694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.364354 4724 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="352a603e387ff155670fe4a2c6683cb0b84e8e31c083e4983d466b33a8b5cfa4" exitCode=0
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.364498 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.364476 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"352a603e387ff155670fe4a2c6683cb0b84e8e31c083e4983d466b33a8b5cfa4"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.365524 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.365586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.365601 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.367423 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952"}
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.367484 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.368492 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.368571 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.368586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:59 crc kubenswrapper[4724]: W1002 12:58:59.369608 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.369915 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.487899 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.490625 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.490680 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.490694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.490727 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.491443 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Oct 02 12:58:59 crc kubenswrapper[4724]: I1002 12:58:59.570313 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:58:59 crc kubenswrapper[4724]: W1002 12:58:59.585306 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.585409 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:59 crc kubenswrapper[4724]: W1002 12:58:59.834494 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.834628 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:58:59 crc kubenswrapper[4724]: W1002 12:58:59.940961 4724 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:58:59 crc kubenswrapper[4724]: E1002 12:58:59.941054 4724 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.242856 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.372448 4724 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="16abfe64f96f4e7e698dfd025fd2c3072c6e12dacc2b40cdde116117b2c0e589" exitCode=0
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.372506 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"16abfe64f96f4e7e698dfd025fd2c3072c6e12dacc2b40cdde116117b2c0e589"}
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.372601 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.373481 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.373572 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.373586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376346 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031"}
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376397 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376444 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376455 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376463 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.376504 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.377734 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.377759 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.377770 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378295 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378313 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378322 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378437 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378456 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378465 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378550 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378592 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:00 crc kubenswrapper[4724]: I1002 12:59:00.378602 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.242902 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.380069 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.380078 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.380246 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381460 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381614 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381512 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:01 crc kubenswrapper[4724]: I1002 12:59:01.381981 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.242235 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.384766 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.387686 4724 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031" exitCode=255
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.387761 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031"}
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.387942 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.389358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.389387 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.389397 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.390017 4724 scope.go:117] "RemoveContainer" containerID="6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.391293 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4dcaf9b13a51b56550d0d225ced71a111f0761297b1d8389ff277e6900e07d2"}
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.423805 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.424066 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.426106 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.426154 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.426163 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.432626 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:59:02 crc kubenswrapper[4724]: E1002 12:59:02.457814 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="6.4s"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.692324 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.694311 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.694359 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.694372 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:02 crc kubenswrapper[4724]: I1002 12:59:02.694403 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 12:59:02 crc kubenswrapper[4724]: E1002 12:59:02.695095 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.025793 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.396957 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.398726 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55"}
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.398894 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.398981 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.400304 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.400366 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.400391 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.403773 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0655790610f1bfc9af42334228a39b7d8830cd43b0d34b181d282f9e1b9285e2"}
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.403828 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"640e93a6ac17bb56e02d29e12855bfdd6f1950ecd1cdbbf45817af206d9094b6"}
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.403860 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f1a1e9b765d86128893c884d523023d510c6785d1c8d9050f717308fc1b7c3a1"}
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.403898 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.405417 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.405471 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.405495 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:03 crc kubenswrapper[4724]: I1002 12:59:03.790868 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.413447 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3fb1a3b730285a2f0540429adf68f8b0659131366ac0360d9f3c1d3f17d3980b"}
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.413485 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.413601 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.413553 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.413764 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415177 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415186 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415233 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415246 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415268 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415275 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415283 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.415288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.478596 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 02 12:59:04 crc kubenswrapper[4724]: I1002 12:59:04.481794 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.055954 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.416201 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.416272 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.416358 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417312 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417348 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417361 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417655 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.417706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:05 crc kubenswrapper[4724]: I1002 12:59:05.925643 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.026548 4724 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.026670 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:59:06 crc kubenswrapper[4724]: E1002 12:59:06.382362 4724 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.419763 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.419763 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421100 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421111 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421181 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421238 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.421254 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.617466 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.617796 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.619352 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.619415 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:06 crc kubenswrapper[4724]: I1002 12:59:06.619431 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.804308 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.804518 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.805952 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.806040 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.806076 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:07 crc kubenswrapper[4724]: I1002 12:59:07.809910 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 12:59:08 crc kubenswrapper[4724]: I1002 12:59:08.425565 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:08 crc kubenswrapper[4724]: I1002 12:59:08.430179 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:08 crc kubenswrapper[4724]: I1002 12:59:08.430243 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:08 crc kubenswrapper[4724]: I1002 12:59:08.430257 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:09 crc kubenswrapper[4724]: I1002 12:59:09.095906 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:09 crc kubenswrapper[4724]: I1002 12:59:09.098260 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:09 crc kubenswrapper[4724]: I1002 12:59:09.098320 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:09 crc kubenswrapper[4724]: I1002 12:59:09.098334 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:09 crc kubenswrapper[4724]: I1002 12:59:09.098371 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 12:59:13 crc kubenswrapper[4724]: I1002 12:59:13.242741 4724 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Oct 02 12:59:13 crc kubenswrapper[4724]: I1002 12:59:13.791658 4724 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body=
Oct 02 12:59:13 crc kubenswrapper[4724]: I1002 12:59:13.792028 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.121316 4724 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.121515 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.516930 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.517285 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.519033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.519099 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.519112 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:14 crc kubenswrapper[4724]: I1002 12:59:14.536695 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 02 12:59:15 crc kubenswrapper[4724]: I1002 12:59:15.444252 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:15 crc kubenswrapper[4724]: I1002 12:59:15.445201 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:15 crc kubenswrapper[4724]: I1002 12:59:15.445274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:15 crc kubenswrapper[4724]: I1002 12:59:15.445284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:16 crc kubenswrapper[4724]: I1002 12:59:16.026793 4724 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 12:59:16 crc kubenswrapper[4724]: I1002 12:59:16.026928 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:59:16 crc kubenswrapper[4724]: E1002 12:59:16.382648 4724 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.797874 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.798046 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.799226 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.799284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.799308 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:18 crc kubenswrapper[4724]: I1002 12:59:18.802453 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.107401 4724 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="7s"
Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.108861 4724 trace.go:236] Trace[1893670514]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 12:59:05.472) (total time: 13635ms):
Oct
02 12:59:19 crc kubenswrapper[4724]: Trace[1893670514]: ---"Objects listed" error: 13635ms (12:59:19.108) Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[1893670514]: [13.635934121s] [13.635934121s] END Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.108894 4724 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.109806 4724 trace.go:236] Trace[1632143051]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 12:59:05.745) (total time: 13364ms): Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[1632143051]: ---"Objects listed" error: 13364ms (12:59:19.109) Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[1632143051]: [13.364444686s] [13.364444686s] END Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.109830 4724 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.109911 4724 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.109916 4724 trace.go:236] Trace[2113044620]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 12:59:04.540) (total time: 14569ms): Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[2113044620]: ---"Objects listed" error: 14569ms (12:59:19.109) Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[2113044620]: [14.569190274s] [14.569190274s] END Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.109937 4724 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.111089 4724 trace.go:236] Trace[1935049778]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 12:59:04.487) (total time: 14623ms): Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[1935049778]: ---"Objects listed" 
error: 14623ms (12:59:19.110) Oct 02 12:59:19 crc kubenswrapper[4724]: Trace[1935049778]: [14.623124474s] [14.623124474s] END Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.111126 4724 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.112441 4724 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.242133 4724 apiserver.go:52] "Watching apiserver" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.249371 4724 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.249715 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250214 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250323 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250419 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250632 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250665 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.250898 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.250939 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.251271 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.251338 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.252097 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.252929 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.253070 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.253229 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.253653 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.253747 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.253892 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.254435 4724 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.254835 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.255219 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 
02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.302738 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310800 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310850 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310872 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310890 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310909 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310927 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310944 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.310962 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311050 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311067 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311086 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311130 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311156 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311189 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311215 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311246 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311276 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311305 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311325 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 
12:59:19.311347 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311333 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311365 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311466 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311496 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311518 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311560 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311579 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311595 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311612 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311634 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 12:59:19 crc 
kubenswrapper[4724]: I1002 12:59:19.311651 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311670 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311692 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311707 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311723 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311742 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311764 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311786 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311802 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311822 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311838 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 
12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311858 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311875 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311891 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311934 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311952 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311969 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311961 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311974 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312040 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312043 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312201 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312213 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.311987 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312241 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312276 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312306 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312314 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312329 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312351 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312373 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312393 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312453 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312471 4724 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312489 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312507 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312523 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312556 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312635 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312653 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313019 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313047 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313074 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313098 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 12:59:19 
crc kubenswrapper[4724]: I1002 12:59:19.313129 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313156 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313179 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313204 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313226 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313247 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313271 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313293 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313317 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312326 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312368 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312622 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312718 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313805 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312840 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312877 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.312932 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313051 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313066 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313236 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313313 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313318 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313379 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313430 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313570 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313687 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313755 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313952 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313798 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.313845 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314037 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314075 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314062 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314138 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314334 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314401 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314451 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314461 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314498 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314662 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314705 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314753 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314879 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314931 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314915 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.314972 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315004 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315023 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315060 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315099 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315123 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315164 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315194 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315224 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" 
(UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315259 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315281 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315309 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315332 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315360 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315411 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315436 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315461 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315469 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315486 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315497 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315510 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315555 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315653 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 12:59:19 crc 
kubenswrapper[4724]: I1002 12:59:19.315680 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315704 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315731 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315752 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315777 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315820 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315871 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315895 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315948 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315969 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.315995 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 
12:59:19.316018 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316051 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316077 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316100 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316129 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316154 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316183 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316208 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316235 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316257 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316255 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" 
(UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316278 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316304 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316326 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316352 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316372 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316394 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316417 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316415 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316441 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316466 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316473 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316491 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316518 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316563 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316589 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316616 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316640 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316659 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316664 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316738 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316747 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316803 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316829 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316864 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316904 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316927 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316945 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316950 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.316971 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317001 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317030 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317112 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317127 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317146 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317170 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317196 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317225 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317429 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317459 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317471 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317833 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.317900 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318009 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318039 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318058 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318140 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318319 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318796 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319104 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319314 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319527 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319565 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319636 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319755 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.319946 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.320023 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.320143 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.320506 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.318912 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.320818 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.321220 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 12:59:19.821183591 +0000 UTC m=+24.275942832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322111 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322164 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322197 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322579 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322555 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322654 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322693 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322723 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 
crc kubenswrapper[4724]: I1002 12:59:19.322755 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322782 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322812 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322850 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.322878 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323024 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" 
(UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323058 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323370 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323413 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323444 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323474 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323475 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323505 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323558 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323588 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323617 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323648 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323678 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323706 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323739 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323765 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323792 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323823 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323852 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323882 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323950 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323977 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324001 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324095 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324125 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324505 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324566 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324621 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324687 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324719 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324744 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324771 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324795 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 
12:59:19.324845 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324874 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324902 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324928 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.324996 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325082 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325115 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325153 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325189 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325205 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325293 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325479 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325217 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325601 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.325877 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.326024 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.326216 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.326231 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.326378 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.326397 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.327153 4724 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.327769 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.327780 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.323437 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.327979 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328174 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328294 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328351 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328793 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328819 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.328436 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.321215 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329875 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329910 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330035 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330063 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330138 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330485 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329162 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329261 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329699 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330671 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.329743 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330775 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330820 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330841 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330897 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.330921 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.331061 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.331106 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.331605 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.331982 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.332045 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.332074 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:19.832049814 +0000 UTC m=+24.286808945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333078 4724 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333114 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333129 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 
12:59:19.333143 4724 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333154 4724 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333166 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333178 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333191 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333203 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333213 4724 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333224 4724 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333236 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333248 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333260 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333271 4724 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333281 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333290 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333326 4724 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333338 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333351 4724 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333388 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333402 4724 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333412 4724 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333422 4724 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333432 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333442 4724 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333453 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333464 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333474 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333484 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333495 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333505 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc 
kubenswrapper[4724]: I1002 12:59:19.333517 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333528 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333557 4724 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333571 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333582 4724 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333591 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333601 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333610 4724 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333621 4724 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333632 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333642 4724 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333652 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333662 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333675 4724 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333683 4724 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333694 4724 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333704 4724 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333715 4724 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333723 4724 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333732 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333741 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333750 4724 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333759 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333769 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333778 4724 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333786 4724 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333795 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333804 4724 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333813 4724 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc 
kubenswrapper[4724]: I1002 12:59:19.333822 4724 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333831 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333839 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333848 4724 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333858 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333866 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333875 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333884 4724 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333932 4724 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333950 4724 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333964 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333975 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333984 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.333993 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334001 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath 
\"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334011 4724 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334022 4724 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334031 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334040 4724 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334049 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334058 4724 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334068 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334080 4724 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334089 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334098 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334107 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334119 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334139 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334151 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334163 4724 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334175 4724 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334187 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334446 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334426 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334484 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.334756 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.335014 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.335571 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.335603 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.335830 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.335973 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.336097 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.336895 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.336905 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.338026 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.338312 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.339000 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.339505 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.339836 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.340613 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.340661 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.341564 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.341642 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.341925 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342045 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342071 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342090 4724 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.342060 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342164 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:19.842144879 +0000 UTC m=+24.296904000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342220 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.342283 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:19.842268512 +0000 UTC m=+24.297027843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.342083 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.344394 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.351078 4724 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52286->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.351201 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52286->192.168.126.11:17697: read: connection reset by peer" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.355611 4724 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52282->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.355708 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:52282->192.168.126.11:17697: read: connection reset by peer" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.355743 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.355931 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.356033 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.356044 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.356760 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.356795 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.356817 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.356910 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:19.85687919 +0000 UTC m=+24.311638311 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357114 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357162 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357210 4724 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357233 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357367 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.357497 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.358005 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.358570 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.358077 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.358929 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.358978 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.359566 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.360355 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.359976 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.360734 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.360885 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.360939 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.361495 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.361865 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.361947 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.361869 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.362085 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.362743 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.363049 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.363729 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.363739 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.363882 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.363973 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.364360 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.364420 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.364605 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.364769 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.364813 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.365366 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.367788 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.367805 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.367850 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.368802 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370011 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370254 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370401 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370453 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370691 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370521 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.370946 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.371368 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.371153 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372184 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372188 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372314 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372482 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372603 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.372944 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.373037 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.374130 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.374196 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.374925 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.375035 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.375150 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.375878 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.376650 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.378329 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.385162 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.388309 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.394983 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.406096 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.406901 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.410683 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.421685 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.434807 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.434874 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.434994 4724 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435014 4724 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435028 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435041 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435053 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435121 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435216 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435264 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 
12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435519 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435547 4724 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435561 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435571 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435584 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435681 4724 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435694 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435708 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435723 4724 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435734 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435744 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435755 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435765 4724 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435776 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435786 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435796 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435805 4724 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435814 4724 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435825 4724 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435834 4724 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435843 4724 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435852 4724 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435860 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435899 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435908 4724 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435917 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435925 4724 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435934 4724 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435943 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435958 4724 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435967 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435977 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435985 4724 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.435997 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436007 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436017 4724 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436026 4724 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436034 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436043 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436054 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436633 4724 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436647 4724 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436657 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436666 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436678 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436687 4724 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436696 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436708 4724 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436716 4724 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436727 4724 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436736 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc 
kubenswrapper[4724]: I1002 12:59:19.436744 4724 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436805 4724 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436815 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436825 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436834 4724 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436842 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436851 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436860 4724 reconciler_common.go:293] "Volume detached for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436872 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436882 4724 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436890 4724 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436900 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436909 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436918 4724 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436927 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436938 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436948 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436957 4724 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436968 4724 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436981 4724 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.436993 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437005 4724 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437019 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437031 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437041 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437052 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437065 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437076 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437087 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" 
Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437127 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437138 4724 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437149 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437160 4724 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437170 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437182 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437192 4724 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437210 4724 reconciler_common.go:293] "Volume detached for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437220 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437230 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437241 4724 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437252 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437262 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437272 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.437281 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.457558 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.458209 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.461829 4724 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55" exitCode=255 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.461914 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55"} Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.462029 4724 scope.go:117] "RemoveContainer" containerID="6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.475571 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.480180 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.480436 4724 scope.go:117] "RemoveContainer" 
containerID="8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.480803 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.489939 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.502302 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.514562 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.526438 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.537000 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.566557 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.574768 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.583647 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 12:59:19 crc kubenswrapper[4724]: W1002 12:59:19.589789 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f8d2a5eaebb4d344375ce7d9a7f916f1512fc097c2d1b21056d72bace54c3317 WatchSource:0}: Error finding container f8d2a5eaebb4d344375ce7d9a7f916f1512fc097c2d1b21056d72bace54c3317: Status 404 returned error can't find the container with id f8d2a5eaebb4d344375ce7d9a7f916f1512fc097c2d1b21056d72bace54c3317 Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.841470 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.841576 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.841701 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.841737 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 12:59:20.84169179 +0000 UTC m=+25.296450911 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.841781 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:20.841771592 +0000 UTC m=+25.296530713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.942687 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.942761 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:19 crc kubenswrapper[4724]: I1002 12:59:19.942791 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.942909 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.942967 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.942987 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943002 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943030 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-02 12:59:20.943006794 +0000 UTC m=+25.397765915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943052 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:20.943041475 +0000 UTC m=+25.397800596 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943047 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943108 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943121 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:19 crc kubenswrapper[4724]: E1002 12:59:19.943201 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:20.943181048 +0000 UTC m=+25.397940169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.316979 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.317788 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.319189 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.319989 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.321200 
4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.321828 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.322476 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.323390 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.324071 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.325292 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.325885 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.327169 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.327902 
4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.328934 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.329860 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.330353 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.331269 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.331684 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.332215 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.333560 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.334201 
4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.334832 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.335320 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.336031 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.336429 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.337031 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.338509 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.339392 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.340209 
4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.341011 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.341631 4724 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.341760 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.344709 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.345974 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.346388 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.348001 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 
12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.349028 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.349585 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.351155 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.352072 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.352561 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.353173 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.353827 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.354426 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 
12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.356301 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.356948 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.358048 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.359027 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.359926 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.360473 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.361569 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.362198 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 
12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.362886 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.363912 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.466412 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.466477 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"974ec385e63f813e91f81532b452822fdb6bdd249613ead02dbec29b5417cb8a"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.467982 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.470728 4724 scope.go:117] "RemoveContainer" containerID="8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55" Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.470929 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.472338 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.472375 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.472390 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e581f43a8e6674b40114c0629eeb11966b5b7b762560da4449334f2a744e5b0b"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.474203 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f8d2a5eaebb4d344375ce7d9a7f916f1512fc097c2d1b21056d72bace54c3317"} Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.481668 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.498670 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.512014 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.528344 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.541621 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.555611 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.583824 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dc26ffb7c05b14a4c279727262da34a779313ae11888d9c01f8d08fbf705031\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:01Z\\\",\\\"message\\\":\\\"W1002 12:58:59.821240 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1002 12:58:59.821978 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759409939 cert, and key in /tmp/serving-cert-3456327689/serving-signer.crt, /tmp/serving-cert-3456327689/serving-signer.key\\\\nI1002 12:59:01.219327 1 observer_polling.go:159] Starting file observer\\\\nW1002 12:59:01.225110 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 12:59:01.225280 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:01.225946 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3456327689/tls.crt::/tmp/serving-cert-3456327689/tls.key\\\\\\\"\\\\nF1002 12:59:01.684385 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.601941 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.617804 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.633702 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.667065 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.682424 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.713477 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.754766 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:20Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.851238 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.851332 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.851515 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.851525 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 12:59:22.851458433 +0000 UTC m=+27.306217564 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.851605 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:22.851586406 +0000 UTC m=+27.306345527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.952833 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.952894 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:20 crc kubenswrapper[4724]: I1002 12:59:20.952928 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953098 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953113 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953258 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953275 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953222 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953231 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:22.953199938 +0000 UTC m=+27.407959229 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953335 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953422 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953456 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:22.953392873 +0000 UTC m=+27.408151994 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:20 crc kubenswrapper[4724]: E1002 12:59:20.953559 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:22.953495385 +0000 UTC m=+27.408254686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.312985 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.313018 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.313080 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:21 crc kubenswrapper[4724]: E1002 12:59:21.313175 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:21 crc kubenswrapper[4724]: E1002 12:59:21.313280 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:21 crc kubenswrapper[4724]: E1002 12:59:21.313397 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.512631 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2mrjk"] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.513032 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.515934 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.516740 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.516895 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.535587 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.547850 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.564897 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.581200 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.596961 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.614315 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.630194 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.646269 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.659813 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8f48\" (UniqueName: \"kubernetes.io/projected/e0d209f5-ad55-48f5-b4de-51aa5a972c19-kube-api-access-b8f48\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.659889 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0d209f5-ad55-48f5-b4de-51aa5a972c19-hosts-file\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.761397 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0d209f5-ad55-48f5-b4de-51aa5a972c19-hosts-file\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.761503 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8f48\" (UniqueName: \"kubernetes.io/projected/e0d209f5-ad55-48f5-b4de-51aa5a972c19-kube-api-access-b8f48\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.761557 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0d209f5-ad55-48f5-b4de-51aa5a972c19-hosts-file\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.795424 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8f48\" (UniqueName: \"kubernetes.io/projected/e0d209f5-ad55-48f5-b4de-51aa5a972c19-kube-api-access-b8f48\") pod \"node-resolver-2mrjk\" (UID: \"e0d209f5-ad55-48f5-b4de-51aa5a972c19\") " pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.825830 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2mrjk" Oct 02 12:59:21 crc kubenswrapper[4724]: W1002 12:59:21.841500 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d209f5_ad55_48f5_b4de_51aa5a972c19.slice/crio-76fe771f079eb5e0f551b36f27613c73fc092a61bf1192d8f7d12b43cd08f762 WatchSource:0}: Error finding container 76fe771f079eb5e0f551b36f27613c73fc092a61bf1192d8f7d12b43cd08f762: Status 404 returned error can't find the container with id 76fe771f079eb5e0f551b36f27613c73fc092a61bf1192d8f7d12b43cd08f762 Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.888440 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-829dv"] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.889187 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pr276"] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.889343 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w58lt"] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.889425 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pr276" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.889425 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.890445 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-74k4t"] Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.891778 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.891928 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.892146 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.892370 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.892476 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.894376 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.894425 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.894652 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.895106 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.896826 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.897064 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.897412 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.897663 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.897908 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.899338 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.899812 4724 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.900103 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.900264 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.901388 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.901432 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.902303 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.911413 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366
ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.934874 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.955169 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.971886 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:21 crc kubenswrapper[4724]: I1002 12:59:21.985352 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.001404 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:21Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.021874 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.036509 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.053281 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066367 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-socket-dir-parent\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066456 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066495 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-conf-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066556 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f6090eaa-c182-4788-950c-16352c271233-rootfs\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066594 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb6lr\" (UniqueName: \"kubernetes.io/projected/f6090eaa-c182-4788-950c-16352c271233-kube-api-access-fb6lr\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066630 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cnibin\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066749 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-os-release\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066812 
4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-k8s-cni-cncf-io\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066831 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-kubelet\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066858 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-daemon-config\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066888 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-os-release\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066914 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 
12:59:22.066941 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxqx\" (UniqueName: \"kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066963 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cni-binary-copy\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.066987 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-multus\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067007 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6090eaa-c182-4788-950c-16352c271233-mcd-auth-proxy-config\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067059 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " 
pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067080 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067182 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067202 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067230 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-multus-certs\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067106 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067268 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6090eaa-c182-4788-950c-16352c271233-proxy-tls\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067432 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-binary-copy\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067487 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067573 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-system-cni-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067613 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067637 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067660 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 
12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067691 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067713 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-bin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067735 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-etc-kubernetes\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067755 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067779 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 
12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067808 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwcx\" (UniqueName: \"kubernetes.io/projected/b90b9e02-3565-4ad5-8f8c-eec339fc499c-kube-api-access-vjwcx\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067848 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c844q\" (UniqueName: \"kubernetes.io/projected/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-kube-api-access-c844q\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067944 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.067998 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068051 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns\") pod \"ovnkube-node-w58lt\" (UID: 
\"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068090 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068112 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068130 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-netns\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068148 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-hostroot\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068207 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units\") pod \"ovnkube-node-w58lt\" (UID: 
\"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068235 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068265 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-system-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068285 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cnibin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068313 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.068328 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.082097 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.097350 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.119482 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.132793 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.148912 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.163042 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169514 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-binary-copy\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169601 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169634 4724 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169661 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-multus-certs\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169685 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6090eaa-c182-4788-950c-16352c271233-proxy-tls\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169709 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169734 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-system-cni-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169759 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169755 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169781 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169807 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169836 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169899 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-bin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169907 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169921 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-etc-kubernetes\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169940 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169950 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-multus-certs\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169962 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert\") pod \"ovnkube-node-w58lt\" (UID: 
\"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169995 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170019 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwcx\" (UniqueName: \"kubernetes.io/projected/b90b9e02-3565-4ad5-8f8c-eec339fc499c-kube-api-access-vjwcx\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170038 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c844q\" (UniqueName: \"kubernetes.io/projected/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-kube-api-access-c844q\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170057 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170080 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns\") pod \"ovnkube-node-w58lt\" (UID: 
\"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170106 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170129 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170150 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170175 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-netns\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170196 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-hostroot\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 
12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170216 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170238 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170259 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170282 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-system-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170303 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cnibin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170327 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-socket-dir-parent\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170348 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-binary-copy\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170394 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cnibin\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170362 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cnibin\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170434 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170438 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170460 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-conf-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170464 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170479 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f6090eaa-c182-4788-950c-16352c271233-rootfs\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170498 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170504 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb6lr\" (UniqueName: 
\"kubernetes.io/projected/f6090eaa-c182-4788-950c-16352c271233-kube-api-access-fb6lr\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170529 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-os-release\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170569 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170576 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170588 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-os-release\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170606 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-k8s-cni-cncf-io\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170614 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-netns\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170623 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-kubelet\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170643 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-daemon-config\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170650 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-hostroot\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170665 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxqx\" (UniqueName: \"kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170681 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170687 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cni-binary-copy\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170739 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.169873 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170762 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170814 
4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-run-k8s-cni-cncf-io\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170823 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-conf-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170877 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-bin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170951 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-etc-kubernetes\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170971 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-os-release\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170981 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib\") 
pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170978 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f6090eaa-c182-4788-950c-16352c271233-rootfs\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171034 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-system-cni-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171063 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cnibin\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171049 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170529 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 
12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171105 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171178 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171245 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-system-cni-dir\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171288 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-cni-binary-copy\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171289 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171389 4724 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-kubelet\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170981 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171522 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-os-release\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.170996 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171554 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-socket-dir-parent\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171617 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd\") pod 
\"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171657 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171692 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-multus\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171713 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-host-var-lib-cni-multus\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171761 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6090eaa-c182-4788-950c-16352c271233-mcd-auth-proxy-config\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.171797 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b90b9e02-3565-4ad5-8f8c-eec339fc499c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-829dv\" (UID: 
\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.172017 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.172068 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-multus-daemon-config\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.172383 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6090eaa-c182-4788-950c-16352c271233-mcd-auth-proxy-config\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.174123 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6090eaa-c182-4788-950c-16352c271233-proxy-tls\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.174289 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b90b9e02-3565-4ad5-8f8c-eec339fc499c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " 
pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.175475 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.183582 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.189818 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb6lr\" (UniqueName: \"kubernetes.io/projected/f6090eaa-c182-4788-950c-16352c271233-kube-api-access-fb6lr\") pod \"machine-config-daemon-74k4t\" (UID: \"f6090eaa-c182-4788-950c-16352c271233\") " pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.190129 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxqx\" (UniqueName: \"kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx\") pod \"ovnkube-node-w58lt\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.190239 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwcx\" (UniqueName: \"kubernetes.io/projected/b90b9e02-3565-4ad5-8f8c-eec339fc499c-kube-api-access-vjwcx\") pod \"multus-additional-cni-plugins-829dv\" (UID: \"b90b9e02-3565-4ad5-8f8c-eec339fc499c\") " pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.191383 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c844q\" (UniqueName: \"kubernetes.io/projected/c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4-kube-api-access-c844q\") pod \"multus-pr276\" (UID: \"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\") " pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.197657 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.208625 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pr276" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.213264 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.216859 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-829dv" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.224630 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.232425 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a
4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: W1002 12:59:22.233320 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90b9e02_3565_4ad5_8f8c_eec339fc499c.slice/crio-6fe27a7c7d8df415283cb7d171f0152d6db1446d9c5aed1d2c2b394222a1357d WatchSource:0}: Error finding container 6fe27a7c7d8df415283cb7d171f0152d6db1446d9c5aed1d2c2b394222a1357d: Status 404 returned error can't find the container with id 6fe27a7c7d8df415283cb7d171f0152d6db1446d9c5aed1d2c2b394222a1357d Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.236656 4724 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.250206 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.275731 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.486284 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" exitCode=0 Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.486359 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.486388 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"679ff53a498043077c9aa75da8fcbe9b55c82ff0489a5f61898dbbcfc239ed1f"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.490926 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerStarted","Data":"5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.490969 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" 
event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerStarted","Data":"b4732b54842f46e4c1d55b6f60f511c2b3cc0b19f62e10ad39a09b4f0d719ed3"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.496178 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.496223 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"eb870674e96c4b4c30079dbf6afbf12cce927c706d49be6dba1f2eae8eedff75"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.513147 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerStarted","Data":"6fe27a7c7d8df415283cb7d171f0152d6db1446d9c5aed1d2c2b394222a1357d"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.513223 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.522432 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2mrjk" event={"ID":"e0d209f5-ad55-48f5-b4de-51aa5a972c19","Type":"ContainerStarted","Data":"e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.522510 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2mrjk" 
event={"ID":"e0d209f5-ad55-48f5-b4de-51aa5a972c19","Type":"ContainerStarted","Data":"76fe771f079eb5e0f551b36f27613c73fc092a61bf1192d8f7d12b43cd08f762"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.529233 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457"} Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.532108 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.550648 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.564477 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.579563 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.598486 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.619200 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366
ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.641526 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.657722 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.674985 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.695384 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.712150 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.728882 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.746144 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.764048 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.784763 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.802226 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.827466 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.854490 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.870407 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d60
8d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.884595 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.884852 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 12:59:26.884808748 +0000 UTC m=+31.339567889 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.884985 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.885138 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.885228 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:26.885204878 +0000 UTC m=+31.339964179 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.888342 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.907126 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.926415 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.938758 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:22Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.985821 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.985901 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:22 crc kubenswrapper[4724]: I1002 12:59:22.985968 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986044 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986082 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986093 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986210 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:26.986167403 +0000 UTC m=+31.440926524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986093 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986249 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986261 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986098 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986339 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:26.986314037 +0000 UTC m=+31.441073338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:22 crc kubenswrapper[4724]: E1002 12:59:22.986359 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:26.986350668 +0000 UTC m=+31.441109999 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.030454 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.034110 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.040971 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.044421 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.057238 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.070709 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.085703 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.108521 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.125817 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.143134 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.164756 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.178176 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d60
8d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.193785 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.207018 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.221970 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.237560 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.259333 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.274378 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.288021 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.303583 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.312639 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.312644 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.312667 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:23 crc kubenswrapper[4724]: E1002 12:59:23.312893 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:23 crc kubenswrapper[4724]: E1002 12:59:23.312820 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:23 crc kubenswrapper[4724]: E1002 12:59:23.313021 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.318145 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.328645 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.341102 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.354401 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.367598 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.383356 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.396610 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.408558 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.542091 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.543583 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1" exitCode=0 Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.543642 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548320 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548369 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" 
event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548383 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548396 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548408 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.548421 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742"} Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.561071 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.577243 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.594266 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.606981 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.619473 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.635232 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.654569 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.671648 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.686070 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.706093 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.719551 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.733615 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.782128 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.809020 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.845252 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.886703 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.926635 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:23 crc kubenswrapper[4724]: I1002 12:59:23.964690 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:23Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.002431 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.046760 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.089000 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.125401 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.166649 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.204769 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.245520 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.291128 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.554565 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerStarted","Data":"db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0"} Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.574210 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.589009 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.602926 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.629054 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.643379 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.658221 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.672495 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.686374 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.700554 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.718243 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.730919 4724 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.772395 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:24 crc kubenswrapper[4724]: I1002 12:59:24.806813 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:24Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.312877 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.312885 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:25 crc kubenswrapper[4724]: E1002 12:59:25.313031 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:25 crc kubenswrapper[4724]: E1002 12:59:25.313114 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.312906 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:25 crc kubenswrapper[4724]: E1002 12:59:25.313192 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.566606 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f"} Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.568286 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0" exitCode=0 Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.568351 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0"} Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.582574 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.594514 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.609066 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.626053 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.640822 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.655126 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.669211 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.684110 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.705104 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.717693 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.729566 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.746625 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:25 crc kubenswrapper[4724]: I1002 12:59:25.762269 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:25Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.112814 4724 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 12:59:26 crc 
kubenswrapper[4724]: I1002 12:59:26.115436 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.115473 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.115484 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.115618 4724 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.122919 4724 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.123230 4724 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.124399 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.124432 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.124442 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.124459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.124472 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.144279 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.147824 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.147864 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.147873 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.147890 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.147902 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.159311 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.163297 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.163333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.163342 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.163360 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.163370 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.175207 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.179516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.179580 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.179595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.179621 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.179635 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.191818 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.197286 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.197320 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.197331 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.197346 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.197355 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.214934 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.215116 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.216868 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.216965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.217027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.217090 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.217144 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.326647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.326697 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.326706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.326723 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.326734 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.330141 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.345340 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.360094 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.375460 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.391252 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.412980 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.429231 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.429279 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.429290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.429311 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.429324 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.431403 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.445306 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.446846 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vv7gr"] Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.447309 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.451007 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.451189 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.453372 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.453733 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.460200 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.473104 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.484759 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.497177 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.508710 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.512911 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spmh\" (UniqueName: \"kubernetes.io/projected/58e6c559-83ff-48ec-b337-ddd00852bc3c-kube-api-access-4spmh\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.513003 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e6c559-83ff-48ec-b337-ddd00852bc3c-serviceca\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.513057 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e6c559-83ff-48ec-b337-ddd00852bc3c-host\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.524018 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.531675 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.531709 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.531719 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.531733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc 
kubenswrapper[4724]: I1002 12:59:26.531744 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.544863 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.555919 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.569301 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.574825 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9" exitCode=0 Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.574875 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.585003 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.599218 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.614061 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4spmh\" (UniqueName: \"kubernetes.io/projected/58e6c559-83ff-48ec-b337-ddd00852bc3c-kube-api-access-4spmh\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.614162 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e6c559-83ff-48ec-b337-ddd00852bc3c-serviceca\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.614282 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e6c559-83ff-48ec-b337-ddd00852bc3c-host\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " 
pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.614341 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58e6c559-83ff-48ec-b337-ddd00852bc3c-host\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.617162 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58e6c559-83ff-48ec-b337-ddd00852bc3c-serviceca\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.619281 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.631730 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.634383 4724 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.634418 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.634430 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.634448 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.634461 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.637824 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spmh\" (UniqueName: \"kubernetes.io/projected/58e6c559-83ff-48ec-b337-ddd00852bc3c-kube-api-access-4spmh\") pod \"node-ca-vv7gr\" (UID: \"58e6c559-83ff-48ec-b337-ddd00852bc3c\") " pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.643992 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"d
ns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.658843 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.676576 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.692525 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.706466 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.722295 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.739906 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.740370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.740435 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.740449 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.740473 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.740493 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.758625 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.762919 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vv7gr" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.774227 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: W1002 12:59:26.776642 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e6c559_83ff_48ec_b337_ddd00852bc3c.slice/crio-2b0afae79f0a20808977eadd8654479374d98984ba108599ec4d9e9392f3b4fb WatchSource:0}: Error finding container 2b0afae79f0a20808977eadd8654479374d98984ba108599ec4d9e9392f3b4fb: Status 404 returned error can't find the container with id 2b0afae79f0a20808977eadd8654479374d98984ba108599ec4d9e9392f3b4fb Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.789212 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.803824 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.818449 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.830806 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.842871 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.842919 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.842929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 
12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.842949 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.842960 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.865591 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.903094 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.916873 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.916971 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.917016 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 12:59:34.916997916 +0000 UTC m=+39.371757037 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.917045 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:26 crc kubenswrapper[4724]: E1002 12:59:26.917089 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:34.917080548 +0000 UTC m=+39.371839669 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.944919 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.946846 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.946891 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.946903 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.946921 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.946935 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:26Z","lastTransitionTime":"2025-10-02T12:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:26 crc kubenswrapper[4724]: I1002 12:59:26.984657 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:26Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.010402 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.011322 4724 scope.go:117] "RemoveContainer" containerID="8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55" Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.011571 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.017921 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.017996 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.018022 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018148 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018236 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:35.018213587 +0000 UTC m=+39.472972708 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018239 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018263 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018277 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018335 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:35.01831532 +0000 UTC m=+39.473074611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018526 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018589 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018604 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.018681 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:35.018654369 +0000 UTC m=+39.473413490 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.024016 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.049526 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.049594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.049609 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.049630 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.049642 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.065200 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 
12:59:27.103279 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.152898 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.152956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc 
kubenswrapper[4724]: I1002 12:59:27.152977 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.153001 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.153013 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.255655 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.255711 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.255723 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.255741 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.255753 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.313038 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.313038 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.313038 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.313223 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.313268 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:27 crc kubenswrapper[4724]: E1002 12:59:27.313327 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.359105 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.359147 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.359163 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.359180 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.359191 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.461742 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.461790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.461803 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.461820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.461829 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.564615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.564705 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.564717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.564737 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.564760 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.579315 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vv7gr" event={"ID":"58e6c559-83ff-48ec-b337-ddd00852bc3c","Type":"ContainerStarted","Data":"540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.579360 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vv7gr" event={"ID":"58e6c559-83ff-48ec-b337-ddd00852bc3c","Type":"ContainerStarted","Data":"2b0afae79f0a20808977eadd8654479374d98984ba108599ec4d9e9392f3b4fb"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.581679 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec" exitCode=0 Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.581730 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.593503 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.635707 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.656565 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.668144 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.668182 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.668192 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 
12:59:27.668208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.668218 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.678301 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.688797 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc08
6a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.711619 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.721958 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.733829 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.746469 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.758737 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.770724 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.770786 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.770800 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.770825 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.770839 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.774226 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 
12:59:27.786824 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.800686 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.822438 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.839246 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e2248
4d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.853925 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.866682 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\
\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.874023 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.874090 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.874103 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.874125 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.874137 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.882180 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.898191 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.916656 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366
ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.949165 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.976860 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.976905 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.976917 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.976937 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.976950 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:27Z","lastTransitionTime":"2025-10-02T12:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:27 crc kubenswrapper[4724]: I1002 12:59:27.985309 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:
59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:27Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.020830 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.066105 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.079471 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.079519 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.079549 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.079570 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.079585 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.105227 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.148598 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.182854 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.182907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.182921 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 
12:59:28.182941 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.183013 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.190820 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.224852 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.285719 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.285790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.285802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.285824 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.285835 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.388321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.388381 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.388391 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.388411 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.388423 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.491196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.491243 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.491252 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.491306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.491317 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.590075 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.590622 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.590664 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.593795 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.593845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.593855 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.593907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.593924 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.597700 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerStarted","Data":"c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.608461 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.617391 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.618190 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.628344 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.645983 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.657975 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.672910 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.685190 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.696295 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.696589 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.696660 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.696732 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.696792 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.698998 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cab
d12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.712783 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.725662 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.742671 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.753835 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.764847 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.779321 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.796685 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.799261 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.799304 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.799313 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.799331 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.799344 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.824853 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cab
d12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.866020 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.902369 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.902441 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.902453 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.902474 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.902549 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:28Z","lastTransitionTime":"2025-10-02T12:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.904035 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.945140 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:28 crc kubenswrapper[4724]: I1002 12:59:28.990881 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:28Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.004723 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.004765 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.004775 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.004791 4724 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.004803 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.025214 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.065842 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.106030 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366
ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.107604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.107657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.107673 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.107695 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.107708 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.152148 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.183864 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.210889 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.210932 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.210940 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.210957 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 
12:59:29.210970 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.227624 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.268822 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.310191 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.312719 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.312795 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.312754 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:29 crc kubenswrapper[4724]: E1002 12:59:29.312925 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:29 crc kubenswrapper[4724]: E1002 12:59:29.313029 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.313173 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.313206 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.313220 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.313240 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: E1002 12:59:29.313236 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.313253 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.344631 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b
67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.416129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.416177 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.416190 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.416208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.416254 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.518882 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.518936 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.518948 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.518966 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.518979 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.604360 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5" exitCode=0 Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.604551 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.604834 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.621317 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.621383 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.621402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.621434 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.621458 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.623789 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.635022 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.650022 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.666765 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.679551 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.697497 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.711660 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.724305 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.724349 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.724359 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.724380 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.724393 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.728005 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.743684 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.763450 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.790366 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.828987 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.829042 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.829053 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.829068 4724 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.829079 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.830701 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.866906 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.913130 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:29Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.932457 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.932509 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.932526 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.932572 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:29 crc kubenswrapper[4724]: I1002 12:59:29.932589 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:29Z","lastTransitionTime":"2025-10-02T12:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.036081 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.036133 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.036147 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.036171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.036185 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.139628 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.139968 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.139981 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.140002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.140015 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.242951 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.243005 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.243015 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.243043 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.243056 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.346134 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.347250 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.347331 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.347386 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.347403 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.450224 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.450284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.450295 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.450316 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.450329 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.553793 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.553847 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.553859 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.553878 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.553890 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.612024 4724 generic.go:334] "Generic (PLEG): container finished" podID="b90b9e02-3565-4ad5-8f8c-eec339fc499c" containerID="1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55" exitCode=0 Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.612100 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerDied","Data":"1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.612257 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.631411 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4
f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.648867 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.659274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.659341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.659357 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.659379 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.659394 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.668952 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.688204 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.705287 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.723684 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.748730 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.761961 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.762017 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.762070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.762096 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.762111 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.765707 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.781108 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.799420 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.814520 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.827339 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.842070 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.858577 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:30Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.865138 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.865193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.865203 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.865221 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.865234 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.968162 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.968203 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.968212 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.968229 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:30 crc kubenswrapper[4724]: I1002 12:59:30.968239 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:30Z","lastTransitionTime":"2025-10-02T12:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.071998 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.072062 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.072077 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.072107 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.072126 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.175313 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.175367 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.175377 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.175424 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.175434 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.278940 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.279001 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.279014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.279034 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.279050 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.313620 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.313669 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.313720 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:31 crc kubenswrapper[4724]: E1002 12:59:31.313817 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:31 crc kubenswrapper[4724]: E1002 12:59:31.313943 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:31 crc kubenswrapper[4724]: E1002 12:59:31.314080 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.381820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.381884 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.381900 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.381925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.381955 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.484833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.484895 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.484912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.484943 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.484957 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.588086 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.588152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.588163 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.588187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.588200 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.624238 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" event={"ID":"b90b9e02-3565-4ad5-8f8c-eec339fc499c","Type":"ContainerStarted","Data":"12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.638147 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name
\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.653379 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: 
I1002 12:59:31.668874 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.690666 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.691711 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.691785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.691798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.691816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.691825 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.706245 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.723992 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.743312 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.761366 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
2T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.775431 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.792441 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.794684 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.794746 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.794760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.794781 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.794794 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.810454 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.825185 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.842129 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.858206 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:31Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.898510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.898572 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.898588 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.898605 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:31 crc kubenswrapper[4724]: I1002 12:59:31.898617 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:31Z","lastTransitionTime":"2025-10-02T12:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.001491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.001565 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.001586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.001610 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.001627 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.104747 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.104797 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.104811 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.104830 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.104842 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.207742 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.207802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.207819 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.207843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.207858 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.310281 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.310319 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.310334 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.310368 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.310386 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.412956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.413015 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.413027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.413049 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.413063 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.516549 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.516593 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.516603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.516618 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.516629 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.619002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.619067 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.619081 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.619104 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.619117 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.628667 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/0.log" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.631935 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475" exitCode=1 Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.632043 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.633184 4724 scope.go:117] "RemoveContainer" containerID="1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.648571 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.666797 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.688587 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.705832 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.722624 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.722679 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.722695 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.722714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.722726 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.724598 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.740267 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.752767 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.768073 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.784107 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.804812 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.823478 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.824976 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.825007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.825019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.825039 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.825052 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.841733 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.859702 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.879418 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 
12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab8
78f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:32Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.927575 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.927623 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.927634 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.927655 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:32 crc kubenswrapper[4724]: I1002 12:59:32.927669 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:32Z","lastTransitionTime":"2025-10-02T12:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.030007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.030333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.030483 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.030609 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.030700 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.089475 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692"] Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.090021 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.093338 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.096905 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.110145 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.133500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.133942 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.134024 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.134146 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.134225 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.166160 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.178594 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.178674 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.178771 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znd9t\" (UniqueName: \"kubernetes.io/projected/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-kube-api-access-znd9t\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.178856 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.183124 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.198127 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.214621 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.227197 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.237271 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.237620 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.237696 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.237769 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.237865 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.247032 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cab
d12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.261543 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.276949 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.280306 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.280386 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znd9t\" (UniqueName: \"kubernetes.io/projected/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-kube-api-access-znd9t\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.280411 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.280433 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.281395 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.281498 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.297478 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.309446 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znd9t\" (UniqueName: 
\"kubernetes.io/projected/aeccba6f-b4bb-4dd5-8ab1-798d7a67251a-kube-api-access-znd9t\") pod \"ovnkube-control-plane-749d76644c-z9692\" (UID: \"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.311850 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.312947 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.313063 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.312952 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:33 crc kubenswrapper[4724]: E1002 12:59:33.313263 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:33 crc kubenswrapper[4724]: E1002 12:59:33.313085 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:33 crc kubenswrapper[4724]: E1002 12:59:33.313595 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.331653 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.341659 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.341712 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.341725 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.341748 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.341762 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.355069 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.378752 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.399223 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.403822 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" Oct 02 12:59:33 crc kubenswrapper[4724]: W1002 12:59:33.419745 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeccba6f_b4bb_4dd5_8ab1_798d7a67251a.slice/crio-28e9eb82ddc10f63c9692a3d560a08e4b5deb5396f869ada483d8b2d90989b49 WatchSource:0}: Error finding container 28e9eb82ddc10f63c9692a3d560a08e4b5deb5396f869ada483d8b2d90989b49: Status 404 returned error can't find the container with id 28e9eb82ddc10f63c9692a3d560a08e4b5deb5396f869ada483d8b2d90989b49 Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.430992 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 
12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab8
78f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.444979 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.445033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.445045 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.445070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.445083 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.548520 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.548590 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.548599 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.548616 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.548627 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.637562 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" event={"ID":"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a","Type":"ContainerStarted","Data":"28e9eb82ddc10f63c9692a3d560a08e4b5deb5396f869ada483d8b2d90989b49"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.640495 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/0.log" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.644152 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.644382 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.650764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.650833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.650842 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.650876 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.650888 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.664553 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.680367 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.696777 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.711488 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.724793 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.744009 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.754425 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.754461 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.754470 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.754485 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.754495 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.758156 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.776385 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a793
79b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.792089 4724 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.807945 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.825133 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.838521 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: 
I1002 12:59:33.856205 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.857186 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.857212 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.857224 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.857241 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.857254 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.873489 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.895011 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 
12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:33Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.960629 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.960682 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.960691 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.960708 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:33 crc kubenswrapper[4724]: I1002 12:59:33.960720 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:33Z","lastTransitionTime":"2025-10-02T12:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.063297 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.063330 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.063340 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.063355 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.063365 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.166604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.166655 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.166667 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.166684 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.166697 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.269583 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.269647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.269661 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.269686 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.269700 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.372774 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.372834 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.372843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.372867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.372882 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.476438 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.476486 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.476498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.476515 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.476525 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.579852 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.579912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.579925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.579946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.579961 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.650810 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/1.log" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.651418 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/0.log" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.654353 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf" exitCode=1 Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.654445 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.654519 4724 scope.go:117] "RemoveContainer" containerID="1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.655362 4724 scope.go:117] "RemoveContainer" containerID="b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf" Oct 02 12:59:34 crc kubenswrapper[4724]: E1002 12:59:34.655580 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.656502 4724 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" event={"ID":"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a","Type":"ContainerStarted","Data":"116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.672529 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-man
ager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.682600 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.682659 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.682670 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.682691 4724 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.682704 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.685342 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.707626 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.721408 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: 
I1002 12:59:34.735232 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.749190 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.765400 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.782066 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366
ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.785905 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.785958 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.785972 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.785996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.786012 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.807974 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f
2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.825030 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.839836 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.856376 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.874867 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.888704 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.888756 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.888770 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.888795 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.888807 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.893317 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.910497 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:34Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.992479 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.992557 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.992577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 
12:59:34.992599 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.992615 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:34Z","lastTransitionTime":"2025-10-02T12:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.998393 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:34 crc kubenswrapper[4724]: E1002 12:59:34.998646 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 12:59:50.998620559 +0000 UTC m=+55.453379680 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 12:59:34 crc kubenswrapper[4724]: I1002 12:59:34.998720 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:34 crc kubenswrapper[4724]: E1002 12:59:34.998847 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:34 crc kubenswrapper[4724]: E1002 12:59:34.998907 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:50.998896796 +0000 UTC m=+55.453655917 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.096145 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.096203 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.096214 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.096233 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.096245 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.099381 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.099472 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.099501 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099524 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099616 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:51.099594084 +0000 UTC m=+55.554353205 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099640 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099673 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099690 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099728 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:51.099717657 +0000 UTC m=+55.554476778 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099837 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099888 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099911 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.099985 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:51.099965313 +0000 UTC m=+55.554724434 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.198729 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.198772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.198785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.198806 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.198820 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.301901 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.301951 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.301959 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.301980 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.301995 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.313296 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.313364 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.313445 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.313486 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.313564 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.313652 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.365342 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q7t2t"] Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.366043 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.366137 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.381857 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.397256 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.401456 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.401528 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpmhc\" (UniqueName: \"kubernetes.io/projected/32e04071-6b34-4fc0-9783-f346a72fcf99-kube-api-access-jpmhc\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.404453 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.404497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.404510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.404528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 
crc kubenswrapper[4724]: I1002 12:59:35.404563 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.414141 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d664
38c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.428957 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc 
kubenswrapper[4724]: I1002 12:59:35.446320 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.465586 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.481472 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.498393 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.502191 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.502254 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpmhc\" (UniqueName: \"kubernetes.io/projected/32e04071-6b34-4fc0-9783-f346a72fcf99-kube-api-access-jpmhc\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.502477 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: E1002 12:59:35.502653 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:36.002625523 +0000 UTC m=+40.457384644 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.508321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.508389 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.508405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.508425 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.508436 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.518625 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.519823 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpmhc\" (UniqueName: \"kubernetes.io/projected/32e04071-6b34-4fc0-9783-f346a72fcf99-kube-api-access-jpmhc\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.543251 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 
6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f
2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.559143 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.574615 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.591714 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.607638 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.611804 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.611843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.611855 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc 
kubenswrapper[4724]: I1002 12:59:35.611875 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.611890 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.624008 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.641220 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.662039 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/1.log" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.665745 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" event={"ID":"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a","Type":"ContainerStarted","Data":"5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.680260 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27
6703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.695233 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.711224 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.715171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.715236 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.715256 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.715277 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.715291 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.728931 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.745728 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.760912 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: 
I1002 12:59:35.775876 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc 
kubenswrapper[4724]: I1002 12:59:35.792035 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.809423 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.818010 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.818059 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.818075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.818094 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.818105 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.836713 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f
2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.856299 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.872900 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.889106 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.903482 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.919271 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.920806 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.920882 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.920897 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.920921 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.920934 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:35Z","lastTransitionTime":"2025-10-02T12:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:35 crc kubenswrapper[4724]: I1002 12:59:35.936432 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:35Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.007357 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.007615 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.007693 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:37.007673404 +0000 UTC m=+41.462432525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.023978 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.024033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.024046 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.024065 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.024077 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.127054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.127115 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.127126 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.127147 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.127162 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.230485 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.230550 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.230564 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.230586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.230599 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.329697 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.333552 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.333598 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.333610 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.333632 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.333645 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.340823 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.352633 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc 
kubenswrapper[4724]: I1002 12:59:36.368836 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.384645 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.398761 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.398811 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.398824 4724 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.398842 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.398857 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.402744 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.411887 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.416925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.416970 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.416981 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.417001 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.417014 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.428561 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.432721 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.440027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.440083 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.440103 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.440129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.440148 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.450832 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b14319f358f143317aa7440ec2f2b57507f3d05702bbbdba154b5f0bf845475\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:32Z\\\",\\\"message\\\":\\\"04 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 12:59:32.014253 6004 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 12:59:32.014591 6004 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 12:59:32.014620 6004 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 12:59:32.014635 6004 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 12:59:32.014644 6004 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 12:59:32.014607 6004 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 12:59:32.014839 6004 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 12:59:32.015266 6004 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 12:59:32.015328 6004 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 12:59:32.015358 6004 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 12:59:32.015387 6004 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 12:59:32.015392 6004 factory.go:656] Stopping watch factory\\\\nI1002 12:59:32.015416 6004 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 12:59:32.015420 6004 ovnkube.go:599] Stopped ovnkube\\\\nI1002 12:59:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f
2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.453454 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.458261 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.458306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.458318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.458337 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.458350 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.468247 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z 
is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.471340 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.474899 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.474941 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.474953 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.474976 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.474991 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.481082 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.487453 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: E1002 12:59:36.487618 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.489518 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.489608 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.489621 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.489645 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.489664 4724 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.496857 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.514334 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.529312 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.545602 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.561133 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.574042 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:36Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.592234 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.592283 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.592294 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.592314 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.592328 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.695395 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.695449 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.695463 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.695481 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.695495 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.798670 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.798717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.798728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.798748 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.798762 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.901913 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.901958 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.901971 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.901990 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:36 crc kubenswrapper[4724]: I1002 12:59:36.902004 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:36Z","lastTransitionTime":"2025-10-02T12:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.006048 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.006502 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.006514 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.006530 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.006569 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.018925 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.019222 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.019359 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:39.019329345 +0000 UTC m=+43.474088466 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.108977 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.109028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.109038 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.109057 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.109069 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.211973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.212017 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.212026 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.212043 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.212055 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.312752 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.312835 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.312764 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.312845 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.313006 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.312930 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.313261 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:37 crc kubenswrapper[4724]: E1002 12:59:37.313396 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.314572 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.314627 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.314638 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.314660 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.314678 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.417901 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.417965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.417978 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.418001 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.418016 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.521173 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.521232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.521252 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.521278 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.521293 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.623604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.623648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.623660 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.623677 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.623688 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.725840 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.725883 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.725893 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.725908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.725920 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.828019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.828063 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.828072 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.828087 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.828097 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.930625 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.930666 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.930676 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.930692 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:37 crc kubenswrapper[4724]: I1002 12:59:37.930703 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:37Z","lastTransitionTime":"2025-10-02T12:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.033576 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.033632 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.033640 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.033661 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.033672 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.136591 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.136647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.136657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.136678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.136691 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.238999 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.239067 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.239075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.239098 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.239109 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.342133 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.342175 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.342185 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.342202 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.342214 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.445068 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.445138 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.445152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.445171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.445182 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.547953 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.548044 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.548064 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.548094 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.548121 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.651039 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.651098 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.651118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.651144 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.651161 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.754123 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.754215 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.754232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.754258 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.754274 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.857033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.857132 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.857150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.857171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.857185 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.960603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.960668 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.960690 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.960729 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:38 crc kubenswrapper[4724]: I1002 12:59:38.960750 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:38Z","lastTransitionTime":"2025-10-02T12:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.039482 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.039802 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.039943 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:43.039901346 +0000 UTC m=+47.494660507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.063573 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.063651 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.063670 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.063697 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.063717 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.166504 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.166580 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.166594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.166616 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.166632 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.270127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.270208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.270274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.270296 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.270310 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.313609 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.313670 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.313645 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.313622 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.313840 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.313917 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.314025 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:39 crc kubenswrapper[4724]: E1002 12:59:39.314245 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.373428 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.373483 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.373493 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.373511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.373523 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.476002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.476088 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.476104 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.476123 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.476136 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.584934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.585007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.585027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.585089 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.585110 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.688102 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.688152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.688164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.688184 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.688196 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.791189 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.791512 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.791663 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.791772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.791859 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.895283 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.895345 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.895361 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.895383 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.895401 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.997755 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.997816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.997831 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.997852 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:39 crc kubenswrapper[4724]: I1002 12:59:39.997867 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:39Z","lastTransitionTime":"2025-10-02T12:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.101360 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.101400 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.101430 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.101450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.101463 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.204867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.205294 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.205372 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.205471 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.205577 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.306784 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.308071 4724 scope.go:117] "RemoveContainer" containerID="b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf" Oct 02 12:59:40 crc kubenswrapper[4724]: E1002 12:59:40.308339 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.308941 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.308972 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.308982 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.308997 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.309010 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.321836 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.335692 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.351616 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.367378 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.381873 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.393766 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.407854 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.412170 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.412321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.412383 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.412476 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.412550 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.421863 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.433588 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc 
kubenswrapper[4724]: I1002 12:59:40.447691 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.465278 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.480431 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.497629 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.510068 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: 
I1002 12:59:40.514700 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.514739 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.514749 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.514764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.514774 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.524969 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.547374 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:40Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.617956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.618000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.618009 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.618027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.618038 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.720698 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.720751 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.720760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.720782 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.720797 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.823839 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.823896 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.823912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.823942 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.823961 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.928013 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.928117 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.928159 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.928195 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:40 crc kubenswrapper[4724]: I1002 12:59:40.928216 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:40Z","lastTransitionTime":"2025-10-02T12:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.031491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.031559 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.031570 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.031589 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.031599 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.135294 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.135364 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.135388 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.135420 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.135455 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.238962 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.239047 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.239073 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.239108 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.239134 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.312862 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.312934 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.313032 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:41 crc kubenswrapper[4724]: E1002 12:59:41.313021 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.313145 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:41 crc kubenswrapper[4724]: E1002 12:59:41.313272 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:41 crc kubenswrapper[4724]: E1002 12:59:41.313372 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:41 crc kubenswrapper[4724]: E1002 12:59:41.313770 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.314072 4724 scope.go:117] "RemoveContainer" containerID="8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.341785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.341864 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.341884 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.341910 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.341932 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.445850 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.445918 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.445933 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.445956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.445974 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.548961 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.549012 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.549028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.549082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.549095 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.652255 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.652305 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.652321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.652348 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.652362 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.689707 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.692436 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.692947 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.706308 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.716583 4724 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.728878 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.744548 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.755433 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.755496 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.755515 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.755568 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.755583 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.760676 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.775757 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc 
kubenswrapper[4724]: I1002 12:59:41.793186 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.810118 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.825611 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.844578 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.858803 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.859084 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.859084 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.859214 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.859456 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.859479 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.874077 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z 
is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.886804 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.900297 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5b
b00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.918484 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.935685 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:41Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.962088 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.962129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.962140 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:41 crc 
kubenswrapper[4724]: I1002 12:59:41.962156 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:41 crc kubenswrapper[4724]: I1002 12:59:41.962166 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:41Z","lastTransitionTime":"2025-10-02T12:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.065325 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.065381 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.065393 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.065413 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.065424 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.169118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.169169 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.169180 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.169200 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.169213 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.272474 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.272512 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.272522 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.272550 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.272561 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.376150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.376196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.376208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.376227 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.376240 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.479587 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.479642 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.479652 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.479668 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.479678 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.582832 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.582895 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.582905 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.582923 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.582933 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.686032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.686081 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.686092 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.686112 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.686124 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.788948 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.789006 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.789018 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.789037 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.789050 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.892854 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.892916 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.892929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.892950 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.892963 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.996259 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.996324 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.996336 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.996358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:42 crc kubenswrapper[4724]: I1002 12:59:42.996372 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:42Z","lastTransitionTime":"2025-10-02T12:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.085729 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.085891 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.085969 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 12:59:51.085951975 +0000 UTC m=+55.540711096 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.099131 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.099189 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.099205 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.099229 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.099244 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.202450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.202499 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.202511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.202529 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.202562 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.305916 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.305985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.306000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.306022 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.306035 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.313347 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.313387 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.313357 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.313521 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.313696 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.313850 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.314038 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 12:59:43 crc kubenswrapper[4724]: E1002 12:59:43.314138 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.408389 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.408427 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.408437 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.408455 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.408466 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.511982 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.512040 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.512051 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.512070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.512083 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.615927 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.615987 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.615999 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.616028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.616055 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.718943 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.718993 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.719010 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.719034 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.719045 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.822423 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.822491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.822507 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.822566 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.822592 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.925295 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.925341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.925354 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.925375 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:43 crc kubenswrapper[4724]: I1002 12:59:43.925393 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:43Z","lastTransitionTime":"2025-10-02T12:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.028650 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.028688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.028695 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.028709 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.028721 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.131714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.131793 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.131810 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.131840 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.131858 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.236013 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.236088 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.236101 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.236120 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.236131 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.339883 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.339941 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.339958 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.339983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.339999 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.442983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.443042 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.443061 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.443082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.443096 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.545498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.545561 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.545571 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.545587 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.545603 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.648356 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.648403 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.648414 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.648430 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.648442 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.750791 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.750837 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.750846 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.750862 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.750873 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.854084 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.854144 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.854153 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.854170 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.854180 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.956882 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.956956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.956975 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.956996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:44 crc kubenswrapper[4724]: I1002 12:59:44.957012 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:44Z","lastTransitionTime":"2025-10-02T12:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.064202 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.064241 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.064250 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.064265 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.064275 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.167659 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.167733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.167756 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.167786 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.167808 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.271137 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.271212 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.271234 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.271259 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.271277 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.313376 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.313454 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 12:59:45 crc kubenswrapper[4724]: E1002 12:59:45.313528 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.313492 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.313466 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 12:59:45 crc kubenswrapper[4724]: E1002 12:59:45.313660 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 12:59:45 crc kubenswrapper[4724]: E1002 12:59:45.313695 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 12:59:45 crc kubenswrapper[4724]: E1002 12:59:45.313877 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.374706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.374780 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.374798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.374825 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.374845 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.479028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.479086 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.479098 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.479152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.479170 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.582084 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.582122 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.582133 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.582152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.582167 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.684413 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.684469 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.684482 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.684502 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.684516 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.787402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.787464 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.787487 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.787513 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.787552 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.890378 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.890447 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.890462 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.890485 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.890502 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.993497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.993563 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.993574 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.993594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:45 crc kubenswrapper[4724]: I1002 12:59:45.993608 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:45Z","lastTransitionTime":"2025-10-02T12:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.097321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.097378 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.097396 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.097422 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.097442 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.200450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.200528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.200586 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.200619 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.200644 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.303823 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.303897 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.303915 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.303940 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.303954 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.337380 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cab
d12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.349708 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.365737 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.379156 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.398978 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.405546 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.405595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.405607 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.405627 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.405640 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.417832 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.431021 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.444621 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc 
kubenswrapper[4724]: I1002 12:59:46.458897 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.491108 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.508584 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.508689 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.508717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.508756 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.508833 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.518303 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.542581 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.556381 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.585476 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611041 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611159 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611189 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611197 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611212 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.611223 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.622307 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.624201 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:
34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 
2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.630018 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.638492 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a
578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.649561 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.663071 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.678472 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.692762 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.710108 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.713786 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.713816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.713827 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.713844 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.713859 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.723571 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.727907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.728048 4724 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.728109 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.728169 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.728225 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.736353 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.747083 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\
\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.751645 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.751713 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.751728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.751750 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.751762 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.761795 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.766855 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.772375 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.772433 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.772453 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.772477 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.772493 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.784949 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.787371 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.791231 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.791258 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.791268 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.791284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.791296 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.803227 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.803491 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.808354 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.808414 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.808425 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.808446 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.808461 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.822560 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.823715 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: E1002 12:59:46.823882 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.826557 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.826610 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.826621 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.826644 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.826657 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.838143 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.855012 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.866904 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.882365 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:46Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.929680 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.929894 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.929910 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.929929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:46 crc kubenswrapper[4724]: I1002 12:59:46.929942 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:46Z","lastTransitionTime":"2025-10-02T12:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.032478 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.032530 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.032579 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.032606 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.032624 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.134908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.134960 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.134973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.134990 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.135002 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.237642 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.237696 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.237716 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.237741 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.237760 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.313205 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.313348 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.313408 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:47 crc kubenswrapper[4724]: E1002 12:59:47.313466 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.313503 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:47 crc kubenswrapper[4724]: E1002 12:59:47.313697 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:47 crc kubenswrapper[4724]: E1002 12:59:47.313860 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:47 crc kubenswrapper[4724]: E1002 12:59:47.313999 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.340311 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.340349 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.340360 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.340377 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.340390 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.442992 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.443075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.443099 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.443134 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.443170 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.546919 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.546980 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.546991 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.547009 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.547020 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.650727 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.650787 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.650803 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.650826 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.650844 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.754095 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.754148 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.754162 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.754184 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.754198 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.856645 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.856694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.856707 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.856726 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.856738 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.959321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.959361 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.959369 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.959384 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:47 crc kubenswrapper[4724]: I1002 12:59:47.959395 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:47Z","lastTransitionTime":"2025-10-02T12:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.062302 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.062371 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.062384 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.062402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.062417 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.169900 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.169964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.169977 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.170000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.170013 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.272926 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.272983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.272993 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.273012 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.273023 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.376127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.376182 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.376202 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.376228 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.376247 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.479590 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.479651 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.479663 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.479684 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.479697 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.583503 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.583594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.583614 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.583640 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.583656 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.687014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.687082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.687096 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.687114 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.687128 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.790682 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.790755 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.790766 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.790786 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.790816 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.894626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.894678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.894688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.894705 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:48 crc kubenswrapper[4724]: I1002 12:59:48.894716 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:48Z","lastTransitionTime":"2025-10-02T12:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.000886 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.000999 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.001065 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.001100 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.001126 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.104170 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.104227 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.104239 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.104253 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.104266 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.214944 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.215020 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.215039 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.215117 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.215170 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.313255 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.313425 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.313481 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:49 crc kubenswrapper[4724]: E1002 12:59:49.313711 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.313738 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:49 crc kubenswrapper[4724]: E1002 12:59:49.313883 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:49 crc kubenswrapper[4724]: E1002 12:59:49.313986 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:49 crc kubenswrapper[4724]: E1002 12:59:49.314061 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.318589 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.318643 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.318657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.318678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.318692 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.421938 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.421983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.421995 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.422010 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.422021 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.527825 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.527896 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.527919 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.527946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.527962 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.630899 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.630940 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.630949 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.630967 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.630977 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.733143 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.733187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.733198 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.733217 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.733229 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.835737 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.835816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.835834 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.835860 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.835883 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.938976 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.939022 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.939031 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.939046 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:49 crc kubenswrapper[4724]: I1002 12:59:49.939056 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:49Z","lastTransitionTime":"2025-10-02T12:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.047848 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.047925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.047940 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.047962 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.047975 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.150688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.150725 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.150734 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.150749 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.150762 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.254330 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.254393 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.254411 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.254438 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.254457 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.357318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.357358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.357370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.357391 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.357410 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.461118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.461654 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.461666 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.461684 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.461695 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.564667 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.564734 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.564745 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.564762 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.564771 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.669077 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.669141 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.669164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.669194 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.669218 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.772491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.772656 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.772676 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.772701 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.772713 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.875868 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.875918 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.875929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.875945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.875957 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.980007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.980071 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.980088 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.980114 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:50 crc kubenswrapper[4724]: I1002 12:59:50.980131 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:50Z","lastTransitionTime":"2025-10-02T12:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.079466 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.079750 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 13:00:23.079692582 +0000 UTC m=+87.534451763 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.079964 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.080145 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.080250 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:23.080224995 +0000 UTC m=+87.534984156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.090500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.090626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.090652 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.090688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.090710 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.180823 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.180914 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.180939 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.180971 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181113 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181189 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:07.18116927 +0000 UTC m=+71.635928391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181184 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181236 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181300 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181341 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181354 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181389 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:23.181352354 +0000 UTC m=+87.636111515 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181428 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:23.181412796 +0000 UTC m=+87.636171957 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181309 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181466 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.181595 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:23.181518419 +0000 UTC m=+87.636277580 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.193031 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.193121 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.193164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.193187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.193198 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.296432 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.296481 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.296493 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.296510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.296523 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.312757 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.312794 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.312924 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.312954 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.312996 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.313354 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.313430 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 12:59:51 crc kubenswrapper[4724]: E1002 12:59:51.313530 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.313861 4724 scope.go:117] "RemoveContainer" containerID="b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.399284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.399330 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.399340 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.399358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.399369 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.502123 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.502172 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.502186 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.502208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.502223 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.604769 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.604869 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.604888 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.604920 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.604939 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.707698 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.707760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.707778 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.707803 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.707821 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.810450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.810493 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.810505 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.810525 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.810565 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.914070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.914133 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.914152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.914177 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:51 crc kubenswrapper[4724]: I1002 12:59:51.914195 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:51Z","lastTransitionTime":"2025-10-02T12:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.016964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.017019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.017035 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.017059 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.017077 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.120078 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.120127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.120150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.120176 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.120197 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.222670 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.222750 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.222774 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.222806 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.222827 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.325019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.325058 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.325072 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.325093 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.325107 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.428115 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.428175 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.428198 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.428228 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.428248 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.531914 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.531965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.531975 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.531992 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.532003 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.635223 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.635293 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.635310 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.635343 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.635366 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.737835 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.737915 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.737935 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.737967 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.737991 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.841060 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.841136 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.841162 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.841188 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.841205 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.944933 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.945025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.945041 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.945085 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:52 crc kubenswrapper[4724]: I1002 12:59:52.945100 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:52Z","lastTransitionTime":"2025-10-02T12:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.047861 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.047912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.047929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.047946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.047959 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.149965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.150022 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.150038 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.150057 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.150071 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.253290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.253347 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.253359 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.253377 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.253390 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.313288 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.313388 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.313448 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:53 crc kubenswrapper[4724]: E1002 12:59:53.313596 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.313680 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:53 crc kubenswrapper[4724]: E1002 12:59:53.313915 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:53 crc kubenswrapper[4724]: E1002 12:59:53.314130 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:53 crc kubenswrapper[4724]: E1002 12:59:53.314225 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.356876 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.356927 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.356936 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.356952 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.356964 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.460166 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.460225 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.460240 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.460262 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.460277 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.563209 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.563252 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.563262 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.563283 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.563294 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.666208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.666254 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.666266 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.666284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.666295 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.741343 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/1.log" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.744964 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.746365 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.762286 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf
2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.770170 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.770232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.770245 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.770264 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.770286 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.779493 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.800937 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.816974 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.832064 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.844735 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.859110 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872357 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872837 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872900 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872916 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872938 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.872952 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.884388 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc 
kubenswrapper[4724]: I1002 12:59:53.896060 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.908879 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.923181 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.941609 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.963990 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.976246 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.976292 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.976305 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.976324 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.976336 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:53Z","lastTransitionTime":"2025-10-02T12:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.979438 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:53 crc kubenswrapper[4724]: I1002 12:59:53.993717 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:53Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.014301 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.079091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.079131 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.079140 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.079157 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.079167 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.182420 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.182494 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.182511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.182565 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.182587 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.285707 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.285775 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.285791 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.285818 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.285861 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.389433 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.389488 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.389502 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.389520 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.389565 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.492694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.492776 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.492801 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.493026 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.493051 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.597097 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.597167 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.597188 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.597218 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.597238 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.701429 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.701516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.701555 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.701577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.701592 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.751720 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/2.log" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.753006 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/1.log" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.757866 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79" exitCode=1 Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.757958 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.758021 4724 scope.go:117] "RemoveContainer" containerID="b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.759320 4724 scope.go:117] "RemoveContainer" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79" Oct 02 12:59:54 crc kubenswrapper[4724]: E1002 12:59:54.759706 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.784246 4724 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.804346 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.813528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.813719 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.813731 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.813753 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.814764 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.834711 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.852448 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.868041 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.888247 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.904449 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: 
I1002 12:59:54.918450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.918611 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.918640 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.918732 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.918758 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:54Z","lastTransitionTime":"2025-10-02T12:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.919485 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc 
kubenswrapper[4724]: I1002 12:59:54.937111 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.964094 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f2abd22b65316a586899f71ac99b5451e97b96ff0c19ac2aabe1bb9002e7cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:34Z\\\",\\\"message\\\":\\\"nfig-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 12:59:33.608180 6213 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-w58lt after 0 failed attempt(s)\\\\nI1002 12:59:33.609167 6213 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI1002 12:59:33.609171 6213 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-w58lt\\\\nF1002 12:59:33.608123 6213 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, hand\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] 
Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:54 crc kubenswrapper[4724]: I1002 12:59:54.990581 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:54Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.013272 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.021325 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.021872 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.022066 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.022200 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.022343 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.032306 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.055588 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.071572 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.083066 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.096602 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.125601 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.125775 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.125862 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.125943 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.126023 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.229167 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.229244 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.229264 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.229294 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.229318 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.313513 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.313607 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:55 crc kubenswrapper[4724]: E1002 12:59:55.313689 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.313519 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:55 crc kubenswrapper[4724]: E1002 12:59:55.313784 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.313564 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:55 crc kubenswrapper[4724]: E1002 12:59:55.313995 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:55 crc kubenswrapper[4724]: E1002 12:59:55.314046 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.332291 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.332340 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.332358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.332384 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.332406 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.435648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.435698 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.435710 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.435728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.435742 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.539657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.539707 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.539718 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.539736 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.539746 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.642778 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.642836 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.642848 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.642867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.642884 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.746161 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.746613 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.746788 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.746950 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.747087 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.764165 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/2.log"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.771716 4724 scope.go:117] "RemoveContainer" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79"
Oct 02 12:59:55 crc kubenswrapper[4724]: E1002 12:59:55.771995 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80"
Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.793088 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.815674 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.828783 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.844166 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.849551 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.849597 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.849606 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.849623 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.849635 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.859427 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.880046 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.893863 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.909328 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.923672 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.933382 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.938587 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.951736 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.951773 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.951783 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.951801 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.951815 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:55Z","lastTransitionTime":"2025-10-02T12:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.952862 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.965052 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc 
kubenswrapper[4724]: I1002 12:59:55.979250 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:55 crc kubenswrapper[4724]: I1002 12:59:55.995121 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:55Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.008045 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.024769 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.044836 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.055420 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.055479 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.055492 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.055513 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.055527 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.063066 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.080407 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.096336 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.116406 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.128818 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.141928 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.156526 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b70
4c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.158174 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.158237 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.158252 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc 
kubenswrapper[4724]: I1002 12:59:56.158270 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.158283 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.169474 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.182108 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.192976 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.205186 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.226864 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.249033 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.260768 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.260833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.260851 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.260879 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.260898 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.265189 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.280426 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc 
kubenswrapper[4724]: I1002 12:59:56.296682 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa0
3ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 
12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.329018 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.342123 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.358436 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.363499 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.363584 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.363600 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.363624 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.363638 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.375450 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.394373 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.410048 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.425969 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.442420 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.455867 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.466333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.466389 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.466399 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.466440 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.466454 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.470700 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.482364 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc 
kubenswrapper[4724]: I1002 12:59:56.493792 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.506214 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.518665 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.531933 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.554420 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.569369 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.569412 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.569423 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.569443 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.569459 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.576780 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.596871 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:56Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.672508 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.672560 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.672569 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.672583 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.672593 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.775289 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.775334 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.775343 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.775357 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.775367 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.878258 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.878304 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.878315 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.878333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.878346 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.981698 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.981743 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.981754 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.981771 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:56 crc kubenswrapper[4724]: I1002 12:59:56.981784 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:56Z","lastTransitionTime":"2025-10-02T12:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.005217 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.005280 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.005295 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.005321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.005338 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.020243 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:57Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.024345 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.024387 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.024401 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.024422 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.024438 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.037452 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:57Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.041390 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.041422 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.041433 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.041451 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.041465 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.057264 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:57Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.061402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.061444 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.061459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.061478 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.061494 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.075320 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:57Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.078783 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.078821 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.078833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.078851 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.078862 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.129574 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T12:59:57Z is after 2025-08-24T17:21:41Z" Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.129724 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.131761 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.131806 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.131821 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.131843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.131858 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.234557 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.234627 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.234647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.234675 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.234693 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.313160 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.313201 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.313161 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.313278 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.313160 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.313384 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.313434 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:57 crc kubenswrapper[4724]: E1002 12:59:57.313494 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.336991 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.337023 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.337032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.337047 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.337058 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.440799 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.440853 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.440870 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.440891 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.440908 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.552939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.552985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.552997 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.553016 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.553029 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.655810 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.655876 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.655886 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.655903 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.655914 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.758814 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.758875 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.758885 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.758904 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.758919 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.861727 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.861773 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.861807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.861825 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.861838 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.964816 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.964863 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.964872 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.964889 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:57 crc kubenswrapper[4724]: I1002 12:59:57.964908 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:57Z","lastTransitionTime":"2025-10-02T12:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.067977 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.068045 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.068088 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.068118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.068139 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.170789 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.170863 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.170881 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.170909 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.170929 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.273942 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.273984 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.273995 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.274012 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.274029 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.376890 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.376945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.376955 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.376974 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.376985 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.480217 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.480299 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.480318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.480348 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.480373 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.583835 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.583934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.583948 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.583972 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.583984 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.686919 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.686964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.686973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.686996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.687018 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.790231 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.790267 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.790275 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.790294 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.790308 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.893794 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.893839 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.893853 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.893872 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.893886 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.996742 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.996820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.996832 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.996852 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:58 crc kubenswrapper[4724]: I1002 12:59:58.996869 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:58Z","lastTransitionTime":"2025-10-02T12:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.099915 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.099961 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.099973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.100002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.100015 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.203130 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.203188 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.203207 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.203234 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.203250 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.305291 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.305333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.305344 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.305361 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.305374 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.313580 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.313593 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.313615 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.313580 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 12:59:59 crc kubenswrapper[4724]: E1002 12:59:59.313702 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 12:59:59 crc kubenswrapper[4724]: E1002 12:59:59.313786 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 12:59:59 crc kubenswrapper[4724]: E1002 12:59:59.314077 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 12:59:59 crc kubenswrapper[4724]: E1002 12:59:59.314174 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.410358 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.411156 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.411245 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.411359 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.411501 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.513744 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.514081 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.514158 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.514225 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.514313 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.616932 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.617011 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.617023 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.617041 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.617055 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.720033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.720082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.720092 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.720115 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.720133 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.823245 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.823288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.823300 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.823317 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.823328 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.925998 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.926054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.926067 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.926090 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 12:59:59 crc kubenswrapper[4724]: I1002 12:59:59.926104 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T12:59:59Z","lastTransitionTime":"2025-10-02T12:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.028979 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.029022 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.029034 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.029052 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.029063 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.134892 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.134963 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.134974 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.135016 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.135033 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.238236 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.238287 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.238318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.238337 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.238350 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.342558 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.342685 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.342704 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.342747 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.342757 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.445595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.445655 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.445665 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.445683 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.445695 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.549293 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.549376 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.549385 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.549404 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.549421 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.653153 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.653464 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.653550 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.653676 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.653749 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.756714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.756766 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.756775 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.756792 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.756803 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.859150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.859199 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.859216 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.859240 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.859256 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.961914 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.961972 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.961989 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.962016 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:00 crc kubenswrapper[4724]: I1002 13:00:00.962037 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:00Z","lastTransitionTime":"2025-10-02T13:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.065155 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.065566 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.065654 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.065737 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.065812 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.171003 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.171044 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.171053 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.171070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.171080 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.273399 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.273437 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.273447 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.273465 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.273476 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.312657 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.312738 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.312838 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:01 crc kubenswrapper[4724]: E1002 13:00:01.312962 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:01 crc kubenswrapper[4724]: E1002 13:00:01.312823 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.313030 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:01 crc kubenswrapper[4724]: E1002 13:00:01.313087 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:01 crc kubenswrapper[4724]: E1002 13:00:01.313143 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.376475 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.376772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.376879 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.376975 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.377113 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.480082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.480162 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.480187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.480219 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.480244 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.583189 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.583232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.583263 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.583282 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.583295 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.686072 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.686121 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.686130 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.686149 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.686160 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.789336 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.789414 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.789427 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.789445 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.789457 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.892673 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.892717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.892729 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.892751 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.892765 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.995407 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.995445 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.995453 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.995468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:01 crc kubenswrapper[4724]: I1002 13:00:01.995478 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:01Z","lastTransitionTime":"2025-10-02T13:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.097710 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.097759 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.097769 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.097790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.097801 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.200717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.200765 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.200774 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.200791 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.200802 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.303508 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.303565 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.303576 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.303592 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.303608 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.406577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.406625 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.406637 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.406657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.406671 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.509349 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.509401 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.509426 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.509448 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.509462 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.612288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.612411 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.612428 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.612732 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.612746 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.715047 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.715212 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.715239 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.715259 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.715274 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.817624 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.817676 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.817687 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.817703 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.817714 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.921062 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.921136 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.921156 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.921179 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:02 crc kubenswrapper[4724]: I1002 13:00:02.921191 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:02Z","lastTransitionTime":"2025-10-02T13:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.024245 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.024289 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.024301 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.024318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.024331 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.126886 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.126934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.126947 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.126966 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.126981 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.229797 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.229851 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.229864 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.229886 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.229901 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.313329 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.313386 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.313367 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.313337 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:03 crc kubenswrapper[4724]: E1002 13:00:03.313511 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:03 crc kubenswrapper[4724]: E1002 13:00:03.313608 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:03 crc kubenswrapper[4724]: E1002 13:00:03.313643 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:03 crc kubenswrapper[4724]: E1002 13:00:03.313739 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.333128 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.333182 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.333196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.333219 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.333238 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.435649 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.435697 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.435706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.435721 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.435732 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.538500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.538561 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.538582 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.538598 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.538608 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.642043 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.642075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.642083 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.642101 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.642113 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.744764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.744819 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.744842 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.744862 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.744873 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.847455 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.847491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.847500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.847516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.847526 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.950790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.951181 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.951290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.951455 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:03 crc kubenswrapper[4724]: I1002 13:00:03.951581 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:03Z","lastTransitionTime":"2025-10-02T13:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.054402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.054747 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.054869 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.054947 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.055026 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.158257 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.158565 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.158715 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.158810 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.158883 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.263091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.263209 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.263224 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.263252 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.263268 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.366090 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.366129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.366137 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.366156 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.366165 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.469038 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.469084 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.469099 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.469125 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.469145 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.573605 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.573898 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.573908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.573929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.573941 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.676830 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.676888 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.676902 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.676919 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.676930 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.779840 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.779884 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.779893 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.779909 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.779921 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.882606 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.882678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.882690 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.882708 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.882722 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.985778 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.985848 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.985861 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.985878 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:04 crc kubenswrapper[4724]: I1002 13:00:04.985893 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:04Z","lastTransitionTime":"2025-10-02T13:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.089113 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.089154 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.089164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.089180 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.089191 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.191688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.191760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.191772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.191793 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.191810 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.294626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.294667 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.294681 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.294699 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.294714 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.313352 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.313409 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.313402 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.313352 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:05 crc kubenswrapper[4724]: E1002 13:00:05.313505 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:05 crc kubenswrapper[4724]: E1002 13:00:05.313649 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:05 crc kubenswrapper[4724]: E1002 13:00:05.313813 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:05 crc kubenswrapper[4724]: E1002 13:00:05.313855 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.397094 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.397137 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.397152 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.397173 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.397185 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.500012 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.500082 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.500095 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.500113 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.500124 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.602696 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.602746 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.602759 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.602778 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.602790 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.705133 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.705165 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.705173 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.705187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.705196 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.807429 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.807503 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.807515 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.807548 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.807563 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.910315 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.910351 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.910359 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.910373 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:05 crc kubenswrapper[4724]: I1002 13:00:05.910383 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:05Z","lastTransitionTime":"2025-10-02T13:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.013384 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.013428 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.013438 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.013454 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.013465 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.116952 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.116997 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.117009 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.117029 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.117044 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.219408 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.219452 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.219460 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.219477 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.219489 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.322512 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.322581 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.322595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.322615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.322627 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.332359 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.355746 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.375278 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.392401 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.406295 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.421584 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.424835 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc 
kubenswrapper[4724]: I1002 13:00:06.424877 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.424890 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.424909 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.424924 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.435117 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.447673 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.461482 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:5
8:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d7
8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.474139 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.487023 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.499368 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.510926 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.523039 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.527326 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.527365 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.527380 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.527400 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.527413 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.538564 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.550693 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.560703 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:06Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:06 crc 
kubenswrapper[4724]: I1002 13:00:06.630054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.630100 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.630112 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.630132 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.630147 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.733731 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.733787 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.733797 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.733813 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.733823 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.836498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.836528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.836555 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.836569 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.836578 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.938988 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.939054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.939068 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.939090 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:06 crc kubenswrapper[4724]: I1002 13:00:06.939105 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:06Z","lastTransitionTime":"2025-10-02T13:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.042306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.042345 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.042355 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.042370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.042381 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.145074 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.145124 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.145145 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.145169 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.145180 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.211798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.211845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.211874 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.211891 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.211902 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.226559 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:07Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.230429 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.230458 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.230468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.230482 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.230493 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.242062 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.241972 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:07Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.242218 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.242282 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 13:00:39.242262736 +0000 UTC m=+103.697021857 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.246374 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.246421 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.246432 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.246450 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.246491 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.258840 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:07Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.262708 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.262735 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.262746 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.262758 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.262767 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.275205 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:07Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.278957 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.278994 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.279006 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.279024 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.279036 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.292088 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:07Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.292273 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.295932 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.295980 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.295996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.296017 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.296032 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.313443 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.313781 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.313872 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.313958 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.314271 4724 scope.go:117] "RemoveContainer" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.314430 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.314438 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.314482 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.314528 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:07 crc kubenswrapper[4724]: E1002 13:00:07.314613 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.398687 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.398727 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.398737 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.398755 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.398768 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.501704 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.501740 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.501748 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.501765 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.501775 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.604486 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.604556 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.604567 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.604590 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.604603 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.707531 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.707585 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.707594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.707609 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.707624 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.809584 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.809624 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.809632 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.809648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.809658 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.912475 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.912509 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.912516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.912546 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:07 crc kubenswrapper[4724]: I1002 13:00:07.912556 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:07Z","lastTransitionTime":"2025-10-02T13:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.015983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.016025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.016036 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.016053 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.016066 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.118103 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.118148 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.118157 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.118180 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.118198 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.220960 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.221006 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.221016 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.221034 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.221050 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.323072 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.323113 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.323126 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.323140 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.323150 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.425907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.425959 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.425971 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.425985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.425994 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.529222 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.529257 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.529266 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.529281 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.529291 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.632449 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.632484 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.632495 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.632513 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.632523 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.736061 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.736112 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.736122 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.736146 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.736166 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.839025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.839068 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.839079 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.839094 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.839104 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.942006 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.942045 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.942054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.942071 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:08 crc kubenswrapper[4724]: I1002 13:00:08.942082 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:08Z","lastTransitionTime":"2025-10-02T13:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.044988 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.045025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.045038 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.045055 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.045067 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.147766 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.147820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.147837 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.147859 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.147871 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.250845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.250918 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.250932 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.250951 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.250965 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.313069 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.313145 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.313228 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.313249 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:09 crc kubenswrapper[4724]: E1002 13:00:09.313395 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:09 crc kubenswrapper[4724]: E1002 13:00:09.313549 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:09 crc kubenswrapper[4724]: E1002 13:00:09.313768 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:09 crc kubenswrapper[4724]: E1002 13:00:09.314007 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.353496 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.353573 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.353594 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.353616 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.353630 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.460849 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.460901 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.460917 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.460938 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.460951 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.563944 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.564000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.564013 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.564031 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.564048 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.666888 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.666926 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.666935 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.666953 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.666964 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.770000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.770043 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.770054 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.770076 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.770098 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.816516 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/0.log" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.816596 4724 generic.go:334] "Generic (PLEG): container finished" podID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4" containerID="5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef" exitCode=1 Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.816630 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerDied","Data":"5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.817066 4724 scope.go:117] "RemoveContainer" containerID="5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.833510 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.847607 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.860850 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.872121 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.872694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.872790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.872891 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.872999 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.875475 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.888980 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.899170 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc 
kubenswrapper[4724]: I1002 13:00:09.911197 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.929022 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.943346 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b3
9abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.955437 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.969051 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.976026 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.976056 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.976065 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.976079 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.976090 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:09Z","lastTransitionTime":"2025-10-02T13:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.981383 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:09 crc kubenswrapper[4724]: I1002 13:00:09.995245 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:09Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.006139 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.018041 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a
9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.032092 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.043503 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.080277 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.080343 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.080355 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.080395 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.080410 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.182417 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.182474 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.182491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.182512 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.182525 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.285167 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.285215 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.285228 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.285247 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.285260 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.387511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.387592 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.387607 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.387627 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.387639 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.490568 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.490626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.490639 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.490658 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.490672 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.593872 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.593939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.593953 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.593976 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.593992 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.697395 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.697485 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.697497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.697520 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.697585 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.800484 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.800527 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.800555 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.800571 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.800582 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.821795 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/0.log" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.821873 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerStarted","Data":"f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.834933 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.847963 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.863569 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.876177 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.890643 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.903093 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: 
I1002 13:00:10.903772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.903818 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.903829 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.903847 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.903858 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:10Z","lastTransitionTime":"2025-10-02T13:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.915514 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc 
kubenswrapper[4724]: I1002 13:00:10.931124 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa0
3ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 
12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.949147 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.962726 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.973866 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:10 crc kubenswrapper[4724]: I1002 13:00:10.987867 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:10Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.002594 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:11Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.006500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.006568 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.006579 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.006601 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.006613 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.013448 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:11Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.026970 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f6
4bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:11Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.040302 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:11Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.054120 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:11Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.109251 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.109305 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.109318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.109340 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.109354 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.212075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.212118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.212127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.212143 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.212156 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.312964 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.313014 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.313049 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:11 crc kubenswrapper[4724]: E1002 13:00:11.313130 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.312964 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:11 crc kubenswrapper[4724]: E1002 13:00:11.313417 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:11 crc kubenswrapper[4724]: E1002 13:00:11.313603 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:11 crc kubenswrapper[4724]: E1002 13:00:11.313819 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.315574 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.315616 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.315629 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.315647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.315658 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.418282 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.418323 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.418331 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.418347 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.418359 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.521806 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.521844 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.521855 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.521870 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.521881 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.624529 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.624601 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.624612 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.624654 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.624672 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.727370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.727417 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.727427 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.727444 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.727459 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.830436 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.830486 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.830498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.830516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.830529 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.933794 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.933826 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.933836 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.933850 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:11 crc kubenswrapper[4724]: I1002 13:00:11.933861 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:11Z","lastTransitionTime":"2025-10-02T13:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.036157 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.036194 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.036206 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.036224 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.036236 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.139711 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.139752 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.139764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.139783 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.139795 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.243151 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.243405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.243556 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.243678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.243768 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.347237 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.347288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.347301 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.347323 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.347338 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.450729 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.451045 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.451132 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.451230 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.451356 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.554406 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.554449 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.554476 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.554497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.554506 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.657019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.657367 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.657444 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.657510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.657616 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.765394 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.765430 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.765442 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.765459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.765470 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.867987 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.868089 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.868103 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.868123 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.868135 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.970877 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.970934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.970950 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.970973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:12 crc kubenswrapper[4724]: I1002 13:00:12.970984 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:12Z","lastTransitionTime":"2025-10-02T13:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.074316 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.074382 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.074392 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.074409 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.074421 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.177829 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.177882 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.177894 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.177912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.177930 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.281138 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.281180 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.281192 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.281213 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.281226 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.313069 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.313074 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.313134 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.313069 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:13 crc kubenswrapper[4724]: E1002 13:00:13.313253 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:13 crc kubenswrapper[4724]: E1002 13:00:13.313365 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:13 crc kubenswrapper[4724]: E1002 13:00:13.313440 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:13 crc kubenswrapper[4724]: E1002 13:00:13.313493 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.384285 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.384325 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.384333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.384347 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.384357 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.487411 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.487462 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.487478 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.487497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.487514 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.590237 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.590279 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.590290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.590306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.590318 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.693608 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.693658 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.693668 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.693687 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.693697 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.796400 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.796449 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.796459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.796475 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.796492 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.899649 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.899700 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.899712 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.899728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:13 crc kubenswrapper[4724]: I1002 13:00:13.899739 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:13Z","lastTransitionTime":"2025-10-02T13:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.002461 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.002569 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.002585 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.002604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.002617 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.105184 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.105224 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.105232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.105248 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.105259 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.207949 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.207993 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.208002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.208019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.208030 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.310656 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.310692 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.310701 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.310716 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.310726 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.413681 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.413738 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.413751 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.413773 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.413785 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.516791 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.516839 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.516850 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.516867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.516878 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.619200 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.619242 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.619251 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.619267 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.619277 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.722801 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.722842 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.722853 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.722879 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.722889 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.861699 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.861750 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.861764 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.861785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.861799 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.964830 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.964870 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.964884 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.964901 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:14 crc kubenswrapper[4724]: I1002 13:00:14.964914 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:14Z","lastTransitionTime":"2025-10-02T13:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.068366 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.068410 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.068423 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.068444 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.068457 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.171321 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.171369 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.171380 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.171398 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.171411 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.274293 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.274344 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.274352 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.274371 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.274382 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.312798 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.312878 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.312901 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:15 crc kubenswrapper[4724]: E1002 13:00:15.313154 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.313192 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:15 crc kubenswrapper[4724]: E1002 13:00:15.313300 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:15 crc kubenswrapper[4724]: E1002 13:00:15.313409 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:15 crc kubenswrapper[4724]: E1002 13:00:15.313625 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.377868 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.377917 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.377926 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.377943 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.377955 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.481400 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.481452 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.481470 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.481490 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.481505 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.584121 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.584173 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.584185 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.584204 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.584218 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.687218 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.687275 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.687284 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.687303 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.687315 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.790911 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.790962 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.790971 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.790986 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.790997 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.893491 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.893558 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.893571 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.893589 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.893601 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.995777 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.995828 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.995842 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.995860 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:15 crc kubenswrapper[4724]: I1002 13:00:15.995874 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:15Z","lastTransitionTime":"2025-10-02T13:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.099018 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.099097 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.099117 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.099137 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.099150 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.201945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.202002 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.202014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.202032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.202046 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.304510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.304624 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.304643 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.304668 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.304718 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.324403 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.327717 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.350488 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.367642 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.378808 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395
f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.390828 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a
9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406425 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406762 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406831 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406848 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406871 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.406887 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.420411 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.432244 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.445270 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.454704 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.469963 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db734
8002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.484825 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.499247 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc 
kubenswrapper[4724]: I1002 13:00:16.513658 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.513728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.513745 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.513770 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.513790 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.515635 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b7
7d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.529196 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.543740 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.556632 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:16Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.616246 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.616291 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.616303 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.616319 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.616329 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.718514 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.718598 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.718609 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.718629 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.718640 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.822956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.823009 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.823019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.823040 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.823051 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.925984 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.926023 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.926032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.926052 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:16 crc kubenswrapper[4724]: I1002 13:00:16.926065 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:16Z","lastTransitionTime":"2025-10-02T13:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.030105 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.030167 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.030177 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.030199 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.030212 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.133707 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.133765 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.133782 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.133802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.133815 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.236830 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.236893 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.236906 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.236926 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.236937 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.313270 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.313409 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.313313 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.313270 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.313448 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.313636 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.313752 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.314010 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.339007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.339045 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.339055 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.339086 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.339098 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.442394 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.442448 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.442459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.442477 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.442488 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.545351 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.545387 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.545395 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.545409 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.545419 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.645060 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.645127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.645145 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.645167 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.645178 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.656978 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:17Z is after 2025-08-24T17:21:41Z"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.660923 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.660964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.660973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.660990 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.661000 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.671813 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:17Z is after 2025-08-24T17:21:41Z"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.675432 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.675483 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.675498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.675519 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.675565 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.687554 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:17Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.691297 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.691548 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.691623 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.691710 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.691785 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.703503 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:17Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.707047 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.707081 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.707091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.707111 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.707122 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.719168 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:17Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:17 crc kubenswrapper[4724]: E1002 13:00:17.719296 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.721064 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.721105 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.721113 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.721131 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.721143 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.824907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.824972 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.824988 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.825007 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.825027 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.928297 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.928343 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.928352 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.928367 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:17 crc kubenswrapper[4724]: I1002 13:00:17.928377 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:17Z","lastTransitionTime":"2025-10-02T13:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.031393 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.031446 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.031456 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.031479 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.031492 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.133604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.133662 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.133672 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.133690 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.133699 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.236950 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.236993 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.237003 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.237019 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.237030 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.340583 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.340629 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.340641 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.340660 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.340673 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.443959 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.444017 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.444032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.444052 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.444069 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.547344 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.547405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.547415 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.547435 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.547448 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.650364 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.650396 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.650405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.650419 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.650430 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.753061 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.753117 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.753128 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.753146 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.753157 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.855483 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.855548 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.855563 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.855581 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.855593 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.958529 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.958621 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.958634 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.958653 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:18 crc kubenswrapper[4724]: I1002 13:00:18.958668 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:18Z","lastTransitionTime":"2025-10-02T13:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.061083 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.061150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.061165 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.061189 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.061201 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.164411 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.164457 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.164468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.164488 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.164502 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.267384 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.267432 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.267441 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.267458 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.267468 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.313071 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.313109 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.313112 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.313142 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:19 crc kubenswrapper[4724]: E1002 13:00:19.313234 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:19 crc kubenswrapper[4724]: E1002 13:00:19.313355 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:19 crc kubenswrapper[4724]: E1002 13:00:19.313408 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:19 crc kubenswrapper[4724]: E1002 13:00:19.313483 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.370392 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.370460 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.370475 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.370495 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.370509 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.474035 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.474089 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.474099 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.474129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.474140 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.576765 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.576811 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.576822 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.576837 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.576848 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.679630 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.679977 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.680075 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.680183 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.680283 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.783091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.783467 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.783631 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.783779 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.783890 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.887486 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.887946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.888038 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.888146 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.888259 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.991729 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.991807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.991827 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.991847 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:19 crc kubenswrapper[4724]: I1002 13:00:19.991860 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:19Z","lastTransitionTime":"2025-10-02T13:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.095467 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.095519 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.095529 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.095563 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.095576 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.199104 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.199158 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.199169 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.199193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.199207 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.302751 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.302821 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.302833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.302853 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.302868 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.405830 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.405869 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.405877 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.405894 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.405906 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.508699 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.508770 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.508782 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.508799 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.508810 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.610823 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.610912 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.610926 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.610943 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.610958 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.714183 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.714523 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.714648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.714874 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.714974 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.819293 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.819343 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.819353 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.819370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.819381 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.922318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.922354 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.922370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.922389 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:20 crc kubenswrapper[4724]: I1002 13:00:20.922400 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:20Z","lastTransitionTime":"2025-10-02T13:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.025707 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.025768 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.025786 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.025807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.025821 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.128736 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.128796 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.128807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.128824 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.128840 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.234868 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.234920 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.234931 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.234952 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.234965 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.312617 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.312663 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.312733 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.312770 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:21 crc kubenswrapper[4724]: E1002 13:00:21.312910 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:21 crc kubenswrapper[4724]: E1002 13:00:21.313024 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:21 crc kubenswrapper[4724]: E1002 13:00:21.313088 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:21 crc kubenswrapper[4724]: E1002 13:00:21.313185 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.338302 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.338377 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.338394 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.338418 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.338434 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.441196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.441262 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.441274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.441306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.441321 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.543753 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.543781 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.543788 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.543802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.543812 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.646309 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.646675 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.646779 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.646863 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.646945 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.777440 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.777490 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.777501 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.777518 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.777546 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.880709 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.881254 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.881340 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.881412 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.881482 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.984738 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.984812 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.984826 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.984850 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:21 crc kubenswrapper[4724]: I1002 13:00:21.984865 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:21Z","lastTransitionTime":"2025-10-02T13:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.087855 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.087904 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.087915 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.087939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.087958 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.190418 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.191086 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.191153 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.191226 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.191282 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.295954 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.296000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.296010 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.296028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.296042 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.314696 4724 scope.go:117] "RemoveContainer" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.399573 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.399630 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.399647 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.399666 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.399678 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.502657 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.502691 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.502699 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.502714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.502724 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.605837 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.606186 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.606198 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.606218 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.606231 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.709109 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.709160 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.709176 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.709193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.709204 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.812473 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.812528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.812556 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.812579 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.812637 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.862191 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/2.log" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.868596 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.869205 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.885849 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ad92a13-94bc-463f-bdba-e4e6c7e97e13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.905299 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.915398 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.915455 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.915467 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.915485 4724 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.915498 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:22Z","lastTransitionTime":"2025-10-02T13:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.921619 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.934598 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:22 crc kubenswrapper[4724]: I1002 13:00:22.955586 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:22 crc 
kubenswrapper[4724]: I1002 13:00:22.975242 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:22Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.006494 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.018514 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.018582 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.018593 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.018611 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.018622 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.040213 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.064802 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.085331 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.089897 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.090141 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.090253 4724 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.090318 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.090300637 +0000 UTC m=+151.545059758 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.090575 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.090508652 +0000 UTC m=+151.545267903 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.103179 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b3
9abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.121255 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.121715 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.121843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.121965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.122082 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.123506 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.135748 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.146977 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.164932 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.180313 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.191440 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.191484 4724 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.191522 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191690 4724 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191709 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191756 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191771 4724 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191786 4724 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191773 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.191754266 +0000 UTC m=+151.646513387 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191839 4724 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191860 4724 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191864 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.191838768 +0000 UTC m=+151.646597889 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.191942 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.19191832 +0000 UTC m=+151.646677441 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.195025 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.207555 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.226033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.226513 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.226827 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.227112 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.227231 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.313126 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.313057 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.313722 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.313214 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.313801 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.313156 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.313890 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.313583 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.330247 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.330304 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.330315 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.330334 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.330347 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.433296 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.433341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.433380 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.433398 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.433409 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.535845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.535910 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.535920 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.535938 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.535949 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.639405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.639448 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.639459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.639479 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.639492 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.742829 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.742894 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.742907 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.742927 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.742939 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.845938 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.846426 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.846564 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.846691 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.846780 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.874816 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/3.log" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.876114 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/2.log" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.879999 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1" exitCode=1 Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.880049 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.880097 4724 scope.go:117] "RemoveContainer" containerID="28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.880712 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1" Oct 02 13:00:23 crc kubenswrapper[4724]: E1002 13:00:23.880898 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.896032 4724 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3
b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.919125 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aaaa47f8700a0b8725c20b81e95579d97b1548baea9e9cea1fec062cd76f79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T12:59:54Z\\\",\\\"message\\\":\\\"693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 12:59:54.044422 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044420 6454 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1002 12:59:54.044433 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1002 12:59:54.044442 6454 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1002 12:59:54.044455 6454 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:23Z\\\",\\\"message\\\":\\\"eject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 13:00:23.497194 6873 services_controller.go:453] Built service openshift-marketplace/marketplace-operator-metrics template LB for network=default: []services.LB{}\\\\nF1002 13:00:23.497314 6873 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T13:00:23Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T13:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b268
6ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.936078 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.947772 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395
f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.949988 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.950043 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.950063 4724 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.950089 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.950105 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:23Z","lastTransitionTime":"2025-10-02T13:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.964043 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.980621 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:23 crc kubenswrapper[4724]: I1002 13:00:23.997512 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.010061 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.020172 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ad92a13-94bc-463f-bdba-e4e6c7e97e13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.032016 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.041316 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.052628 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.052664 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.052676 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.052694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.052706 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.054092 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.068866 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.078076 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc 
kubenswrapper[4724]: I1002 13:00:24.088888 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.102219 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.117589 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.132070 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.155424 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.155596 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.155615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.155636 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.155676 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.258466 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.258514 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.258524 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.258574 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.258587 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.361515 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.361851 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.361945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.362021 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.362085 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.465253 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.465499 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.465520 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.465580 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.465597 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.569241 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.569288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.569299 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.569318 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.569336 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.671774 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.671811 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.671820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.671836 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.671849 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.774942 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.774988 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.774996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.775011 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.775021 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.877758 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.877825 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.877845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.877867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.877882 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.886878 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/3.log" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.891408 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1" Oct 02 13:00:24 crc kubenswrapper[4724]: E1002 13:00:24.891642 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.907487 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.923133 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395
f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.939192 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a
9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.954339 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.967204 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.981698 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.981760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.981773 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.981799 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.981814 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:24Z","lastTransitionTime":"2025-10-02T13:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.984029 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:24 crc kubenswrapper[4724]: I1002 13:00:24.998437 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ad92a13-94bc-463f-bdba-e4e6c7e97e13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:24Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.015529 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.031459 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.050573 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db734
8002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.065809 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.077823 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc 
kubenswrapper[4724]: I1002 13:00:25.084628 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.084682 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.084692 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.084709 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.084722 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.092749 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b7
7d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.106860 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.122180 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.141514 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.158572 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b3
9abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.187961 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.187997 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.188203 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.188221 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.188235 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.189331 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:23Z\\\",\\\"message\\\":\\\"eject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 13:00:23.497194 6873 services_controller.go:453] Built service openshift-marketplace/marketplace-operator-metrics template LB for network=default: []services.LB{}\\\\nF1002 13:00:23.497314 6873 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T13:00:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:25Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.291116 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.291179 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.291188 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.291208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.291221 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.312652 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.312703 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.312703 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.312722 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:25 crc kubenswrapper[4724]: E1002 13:00:25.312800 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:25 crc kubenswrapper[4724]: E1002 13:00:25.312882 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:25 crc kubenswrapper[4724]: E1002 13:00:25.313095 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:25 crc kubenswrapper[4724]: E1002 13:00:25.313229 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.394208 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.394262 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.394274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.394292 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.394304 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.496939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.496985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.496997 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.497014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.497026 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.600351 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.601695 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.601782 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.601864 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.601942 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.705420 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.705458 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.705469 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.705488 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.705500 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.809059 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.809341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.809424 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.809574 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.809664 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.913021 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.913098 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.913110 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.913129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:25 crc kubenswrapper[4724]: I1002 13:00:25.913142 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:25Z","lastTransitionTime":"2025-10-02T13:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.015952 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.016033 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.016048 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.016072 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.016088 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.118694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.119011 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.119028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.119049 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.119060 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.221211 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.221265 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.221276 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.221290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.221299 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.324360 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.324388 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.324396 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.324413 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.324424 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.326964 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.338195 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e04071-6b34-4fc0-9783-f346a72fcf99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpmhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q7t2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc 
kubenswrapper[4724]: I1002 13:00:26.350166 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb9bd1f-850c-4e5f-bcd0-cd9cdca06604\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c3fabef61b20063aa2de1008b2b1fda94f39ddd4d0ad905d1a954955ffff7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7393e9d92fc40b10b30d788707b77d3c3e28fb075c618da4c3575dc0a1c1a9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459703a380feaeff882176dc352b69d3e13089a18d5da98bcebd41018d32091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e76f0ac3479d575a0b073c5b9ad857711fc06bca31e147167babe0fb164614\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.363942 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.379707 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d
7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.392848 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.408758 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dc
ee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.424776 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1527f3c8-7fa1-404b-aafd-7cac512df49d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f058a4a1cf35b7f4800f872139322f2b0096d67548945ba03ecfe6775ffd5103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06801afff9aa03ae9e0bcd8a966f454d81fdfe876806eed22e9d93132989dae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4c45bdd946a4d50c7dc93752d3d1a71b10b830611b64e138a5e1c0eb9e5e30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f8bb8f5d842643df282ff4c021f3aeba0de7dddc0864e3c1deaf64e604fccc\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a3585f00b9d5a19821986d13b0d46f0db3bdd6eef3b4e28f3a63449e0aa0e55\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 12:59:13.975124 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 12:59:13.976423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1894517521/tls.crt::/tmp/serving-cert-1894517521/tls.key\\\\\\\"\\\\nI1002 12:59:19.332110 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 12:59:19.335107 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 12:59:19.335132 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 12:59:19.335161 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 12:59:19.335167 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 12:59:19.339404 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 12:59:19.339433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339437 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 12:59:19.339442 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 12:59:19.339446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 12:59:19.339452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 12:59:19.339454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 12:59:19.339646 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 12:59:19.342508 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e74bd1660931f7bb8dc824f985c539abcbd4c706a11edf7192876b8796e751\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://366ed0b9a29ba0b7606888087909b51b39abd60dd6b11e6509b8613913461b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.426379 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.426421 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.426433 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.426452 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.426463 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.447339 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4089ad23-969c-4222-a8ed-e141ec291e80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:23Z\\\",\\\"message\\\":\\\"eject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 13:00:23.497194 6873 services_controller.go:453] Built service openshift-marketplace/marketplace-operator-metrics template LB for network=default: []services.LB{}\\\\nF1002 13:00:23.497314 6873 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:23Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T13:00:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c7841b20007f04367
54e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w58lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.458698 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vv7gr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e6c559-83ff-48ec-b337-ddd00852bc3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://540350e6cb29bb395f31338d08626d3b110cc3adbe0df070045d178281d9c035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4spmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vv7gr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.475235 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeccba6f-b4bb-4dd5-8ab1-798d7a67251a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://116acf2f64bac5bb00b87c2bf09f16dcec811d4effbacb2ad49e4d2656180b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5971fadd3a2831fd568f1a0e53cefad9dbb4a9c5ad719b4024ec27381842f92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znd9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z9692\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.493322 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8902618f9e9502ff68d0658e199abb68025c551774c41022ed9f3f15dcccba47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.508619 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.524641 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.528279 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.528306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.528314 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.528328 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.528338 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.540241 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pr276" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T13:00:09Z\\\",\\\"message\\\":\\\"2025-10-02T12:59:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f\\\\n2025-10-02T12:59:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_52b6ea56-510d-4c4a-86ca-f2d8b03a329f to /host/opt/cni/bin/\\\\n2025-10-02T12:59:24Z [verbose] multus-daemon started\\\\n2025-10-02T12:59:24Z [verbose] Readiness Indicator file check\\\\n2025-10-02T13:00:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T13:00:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c844q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pr276\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.551518 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ad92a13-94bc-463f-bdba-e4e6c7e97e13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.566081 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cabd12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.579197 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:26Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.630507 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.630577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.630592 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.630610 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.630624 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.732762 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.732815 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.732850 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.733168 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.733219 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.839187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.839231 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.839246 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.839265 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.839276 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.941618 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.941666 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.941684 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.941706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:26 crc kubenswrapper[4724]: I1002 13:00:26.941718 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:26Z","lastTransitionTime":"2025-10-02T13:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.046473 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.046528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.046554 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.046569 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.046578 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.148946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.148996 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.149011 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.149032 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.149050 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.252213 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.252247 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.252254 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.252269 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.252279 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.313350 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.313435 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.313373 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.313519 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.313606 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.313747 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.313847 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.313935 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.354847 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.354897 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.354908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.354925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.354936 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.457614 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.457663 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.457675 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.457694 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.457707 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.560802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.560843 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.560852 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.560871 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.560883 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.663247 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.663298 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.663311 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.663330 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.663343 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.765893 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.765951 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.765964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.765985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.765998 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.817288 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.817337 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.817347 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.817367 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.817380 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.830224 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:27Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.834341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.834387 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.834398 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.834417 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.834430 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.848690 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:27Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.855373 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.856013 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.856108 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.856129 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.856158 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.870123 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:27Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.874160 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.874187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.874196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.874210 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.874220 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.887221 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:27Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.891405 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.891435 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.891444 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.891460 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.891471 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.906100 4724 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T13:00:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1650a7ad-7086-4fb1-9f9b-b9368db16424\\\",\\\"systemUUID\\\":\\\"5e560baf-345b-4d65-984c-1cfbf6a74dd2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:27Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:27 crc kubenswrapper[4724]: E1002 13:00:27.906244 4724 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.907971 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.908004 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.908014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.908029 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:27 crc kubenswrapper[4724]: I1002 13:00:27.908062 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:27Z","lastTransitionTime":"2025-10-02T13:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.011504 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.011585 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.011598 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.011618 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.011631 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.115906 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.115946 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.115955 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.115970 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.115980 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.218106 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.218156 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.218176 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.218196 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.218207 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.320093 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.320172 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.320187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.320203 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.320216 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.422969 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.423014 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.423025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.423042 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.423052 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.525519 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.525595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.525606 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.525626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.525639 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.628993 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.629061 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.629076 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.629100 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.629114 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.736166 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.736263 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.736285 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.736311 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.736330 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.839868 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.839924 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.839934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.839991 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.840004 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.942610 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.942653 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.942662 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.942678 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:28 crc kubenswrapper[4724]: I1002 13:00:28.942689 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:28Z","lastTransitionTime":"2025-10-02T13:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.045251 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.045341 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.045355 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.045376 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.045388 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.148851 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.148901 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.148911 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.148929 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.148941 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.252297 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.252336 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.252348 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.252390 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.252403 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.313576 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.313635 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.313673 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.313694 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:29 crc kubenswrapper[4724]: E1002 13:00:29.313793 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:29 crc kubenswrapper[4724]: E1002 13:00:29.313872 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:29 crc kubenswrapper[4724]: E1002 13:00:29.313996 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:29 crc kubenswrapper[4724]: E1002 13:00:29.314100 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.394612 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.394679 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.394688 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.394710 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.394723 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.497654 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.497690 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.497699 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.497714 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.497726 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.599737 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.599768 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.599776 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.599790 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.599800 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.702939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.702973 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.702983 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.702999 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.703009 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.805845 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.805894 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.805906 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.805925 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.805935 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.908883 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.908934 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.908945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.908965 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:29 crc kubenswrapper[4724]: I1002 13:00:29.908979 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:29Z","lastTransitionTime":"2025-10-02T13:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.012118 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.012169 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.012184 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.012206 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.012221 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.114524 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.114577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.114585 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.114603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.114612 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.217882 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.217945 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.217985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.218003 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.218012 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.321338 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.321402 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.321417 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.321485 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.321512 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.424686 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.424725 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.424735 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.424754 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.424768 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.527563 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.527603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.527615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.527633 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.527643 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.630489 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.630593 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.630603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.630625 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.630639 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.733979 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.734027 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.734037 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.734053 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.734062 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.836998 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.837063 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.837074 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.837095 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.837109 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.939479 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.939523 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.939550 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.939569 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:30 crc kubenswrapper[4724]: I1002 13:00:30.939582 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:30Z","lastTransitionTime":"2025-10-02T13:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.042867 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.042908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.042918 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.042935 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.042947 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.145948 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.145987 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.145998 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.146984 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.147029 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.249756 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.249785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.249793 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.249807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.249817 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.313359 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.313368 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:31 crc kubenswrapper[4724]: E1002 13:00:31.313811 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.313461 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:31 crc kubenswrapper[4724]: E1002 13:00:31.314193 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.313424 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:31 crc kubenswrapper[4724]: E1002 13:00:31.314395 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:31 crc kubenswrapper[4724]: E1002 13:00:31.313964 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.352673 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.352726 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.352739 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.352760 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.352775 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.455820 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.455869 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.455881 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.455900 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.455912 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.571228 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.571277 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.571291 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.571309 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.571319 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.673459 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.673497 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.673507 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.673522 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.673547 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.775904 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.775964 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.775974 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.775990 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.776000 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.878670 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.878715 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.878728 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.878747 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.878758 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.981067 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.981145 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.981164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.981188 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:31 crc kubenswrapper[4724]: I1002 13:00:31.981203 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:31Z","lastTransitionTime":"2025-10-02T13:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.083436 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.083478 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.083489 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.083505 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.083518 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.185960 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.186012 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.186025 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.186046 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.186061 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.288709 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.288758 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.288769 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.288785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.288800 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.390970 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.391013 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.391021 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.391035 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.391045 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.493396 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.493454 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.493468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.493487 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.493497 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.596299 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.596335 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.596346 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.596370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.596382 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.699388 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.699587 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.699602 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.699627 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.699640 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.810132 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.810193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.810205 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.810222 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.810233 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.913091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.913143 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.913155 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.913171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:32 crc kubenswrapper[4724]: I1002 13:00:32.913182 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:32Z","lastTransitionTime":"2025-10-02T13:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.015526 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.015603 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.015616 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.015633 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.015645 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.119516 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.119584 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.119597 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.119615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.119627 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.221702 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.221733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.221741 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.221755 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.221764 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.313626 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.313685 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:33 crc kubenswrapper[4724]: E1002 13:00:33.313761 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:33 crc kubenswrapper[4724]: E1002 13:00:33.313812 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.313863 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:33 crc kubenswrapper[4724]: E1002 13:00:33.313914 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.313944 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:33 crc kubenswrapper[4724]: E1002 13:00:33.313981 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.324575 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.324622 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.324631 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.324648 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.324659 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.427510 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.427600 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.427619 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.427646 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.427664 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.530746 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.530798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.530809 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.530831 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.530843 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.633498 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.633566 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.633577 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.633591 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.633601 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.736523 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.736613 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.736625 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.736649 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.736668 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.838893 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.838930 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.838939 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.838956 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.838969 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.942070 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.942140 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.942150 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.942172 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:33 crc kubenswrapper[4724]: I1002 13:00:33.942185 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:33Z","lastTransitionTime":"2025-10-02T13:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.044522 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.044615 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.044628 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.044645 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.044657 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.147135 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.147171 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.147179 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.147193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.147203 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.250727 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.250787 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.250798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.250823 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.250841 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.353447 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.353500 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.353511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.353528 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.353570 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.456283 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.456327 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.456337 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.456353 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.456364 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.558832 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.558874 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.558883 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.558897 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.558906 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.661735 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.661772 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.661780 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.661798 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.661809 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.764551 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.764605 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.764618 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.764634 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.764646 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.867320 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.867373 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.867383 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.867399 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.867408 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.970312 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.970362 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.970373 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.970388 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:34 crc kubenswrapper[4724]: I1002 13:00:34.970397 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:34Z","lastTransitionTime":"2025-10-02T13:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.073880 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.073930 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.073942 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.073959 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.073971 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.176873 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.176908 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.176917 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.176931 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.176939 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.279193 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.279232 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.279239 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.279253 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.279263 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.312826 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.312871 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:35 crc kubenswrapper[4724]: E1002 13:00:35.312932 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.313004 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.313034 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:35 crc kubenswrapper[4724]: E1002 13:00:35.313160 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:35 crc kubenswrapper[4724]: E1002 13:00:35.313269 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:35 crc kubenswrapper[4724]: E1002 13:00:35.313552 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.381243 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.381282 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.381290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.381306 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.381316 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.483604 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.483653 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.483663 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.483675 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.483684 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.585227 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.585271 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.585281 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.585296 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.585306 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.687744 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.687783 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.687792 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.687807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.687819 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.790036 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.790096 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.790105 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.790122 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.790133 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.892605 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.892680 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.892691 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.892711 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.892730 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.995944 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.995976 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.995985 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.996000 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:35 crc kubenswrapper[4724]: I1002 13:00:35.996009 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:35Z","lastTransitionTime":"2025-10-02T13:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.099317 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.099360 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.099371 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.099388 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.099402 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.206338 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.206409 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.206424 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.206453 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.206471 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.309511 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.309562 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.309572 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.309585 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.309594 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.331708 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92a0e520-2ed8-4d81-8de4-c0f98916b012\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d0d24fc1f56963b40bca48e3e923a5b35f9bec48f0b4fabbbdc9163762b2aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdf23aa5cab
d12ba0af299a7184df0341b5a7f7fb6f8159a10c932599fcef08b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a087f1c52a561564ac405ef652e8cb1542077e507acff22e1189f7370b7ca322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a675aca2218d07044b6e9a391f2b60f5cc822790011af97b0c0b704c1bd98d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.345209 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2mrjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0d209f5-ad55-48f5-b4de-51aa5a972c19\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e94572c741d70917793b2482faf4c7fba57dfc4a29ade8824b35109735b602c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8f48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2mrjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.361930 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ad92a13-94bc-463f-bdba-e4e6c7e97e13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f77fba67120d9373972da96b5e1bffe2fbe5e928e877cac05d409feb94638952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:58:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a7cab088cdfb74745e054222be93aa9d7639f5d200570d867608b2e04f6a0e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:58:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:58:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:58:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.377428 4724 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df76441bbfa05afa242a77e10e6f855d811a9432b790630a9fa5f359ccb6d457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.392730 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ca55ea7e67fb6219b1c835d960d1f9ec6a18a98789ee961e44a278fad9add7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eefb1c1aad15d7ad34aa4384c7d0a20a4c886225cb7d28f0597cf01ad35693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.408391 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.412665 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.412708 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.412717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.412736 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.412747 4724 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.426437 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-829dv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b90b9e02-3565-4ad5-8f8c-eec339fc499c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12d1d4558bef1a0a974bf27f9547c646e216ed5001227ab980e89b05c2a7b5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55a18158dd67ef4667e4a6bc9684c4f4824f848b6e151a0f99c73e5ecfcc5ab1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db711940b17fde8f8a00b44de8366688eb730facc06a174bf09456c12c673eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04ebb8ce267454f3068833092d9644e22484d6d3ea6c89e20fe396e5ea1fe9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:25Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771dcee276aac3571cff4769668348ea4a10d51e792d1604a47c57edac172dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b3f0e84cfea43942d32d84b09db7348002648527ea7d65a716489f8507eeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1271c0f5d096f2fc134e178807b44e3a5980ca67d64f0e0d73f566b348206a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T12:59:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T12:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjwcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-829dv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.441478 4724 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6090eaa-c182-4788-950c-16352c271233\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T12:59:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a3396cd890f4ef8af351a4a3065c2324c20bddc7e079bb78e5a02e9fc8269c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e7
3dbf1a7fb3e7e39c53d1b548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T12:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb6lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T12:59:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74k4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T13:00:36Z is after 2025-08-24T17:21:41Z" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.483074 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.483049583 podStartE2EDuration="50.483049583s" podCreationTimestamp="2025-10-02 12:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:36.482665863 +0000 UTC m=+100.937424984" watchObservedRunningTime="2025-10-02 
13:00:36.483049583 +0000 UTC m=+100.937808704" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.517696 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.517735 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.517745 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.517793 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.517805 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.535024 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.534995318 podStartE2EDuration="1m17.534995318s" podCreationTimestamp="2025-10-02 12:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:36.534727151 +0000 UTC m=+100.989486302" watchObservedRunningTime="2025-10-02 13:00:36.534995318 +0000 UTC m=+100.989754439" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.591230 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pr276" podStartSLOduration=76.591213233 podStartE2EDuration="1m16.591213233s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:36.590922495 +0000 UTC m=+101.045681626" watchObservedRunningTime="2025-10-02 13:00:36.591213233 +0000 UTC m=+101.045972354" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.603568 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vv7gr" podStartSLOduration=76.603519437 podStartE2EDuration="1m16.603519437s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:36.603071135 +0000 UTC m=+101.057830266" watchObservedRunningTime="2025-10-02 13:00:36.603519437 +0000 UTC m=+101.058278568" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.615750 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z9692" podStartSLOduration=75.615728658 
podStartE2EDuration="1m15.615728658s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:36.615215185 +0000 UTC m=+101.069974306" watchObservedRunningTime="2025-10-02 13:00:36.615728658 +0000 UTC m=+101.070487779" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.620671 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.620706 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.620717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.620733 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.620746 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.722333 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.722365 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.722412 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.722436 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.722448 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.825127 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.825164 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.825172 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.825187 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.825195 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.928114 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.928160 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.928176 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.928194 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:36 crc kubenswrapper[4724]: I1002 13:00:36.928204 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:36Z","lastTransitionTime":"2025-10-02T13:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.030802 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.031057 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.031153 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.031268 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.031348 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.133206 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.133265 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.133274 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.133289 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.133298 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.235580 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.235618 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.235626 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.235642 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.235652 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.312808 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.312899 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:37 crc kubenswrapper[4724]: E1002 13:00:37.312981 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.313009 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:37 crc kubenswrapper[4724]: E1002 13:00:37.313048 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:37 crc kubenswrapper[4724]: E1002 13:00:37.313151 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.312843 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:37 crc kubenswrapper[4724]: E1002 13:00:37.313254 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.338028 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.338091 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.338103 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.338119 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.338127 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.440468 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.440508 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.440519 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.440549 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.440560 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.542523 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.542584 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.542595 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.542612 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.542624 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.645249 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.645282 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.645290 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.645304 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.645313 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.747767 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.747807 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.747818 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.747833 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.747842 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.849717 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.849775 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.849785 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.849800 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.849809 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.944305 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.944337 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.944352 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.944370 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.944382 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.958329 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.958364 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.958374 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.958389 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.958400 4724 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T13:00:37Z","lastTransitionTime":"2025-10-02T13:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.981900 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn"] Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.982750 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.984314 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.984895 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.985129 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.985197 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 13:00:37 crc kubenswrapper[4724]: I1002 13:00:37.998746 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.99873193 podStartE2EDuration="21.99873193s" podCreationTimestamp="2025-10-02 13:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:37.997870818 +0000 UTC m=+102.452629939" watchObservedRunningTime="2025-10-02 13:00:37.99873193 +0000 UTC m=+102.453491051" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.011085 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.011064295 podStartE2EDuration="1m15.011064295s" podCreationTimestamp="2025-10-02 12:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:38.010735427 +0000 UTC m=+102.465494558" watchObservedRunningTime="2025-10-02 
13:00:38.011064295 +0000 UTC m=+102.465823416" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.034221 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2mrjk" podStartSLOduration=78.034199685 podStartE2EDuration="1m18.034199685s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:38.021790149 +0000 UTC m=+102.476549270" watchObservedRunningTime="2025-10-02 13:00:38.034199685 +0000 UTC m=+102.488958806" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.049348 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37720fc-c80f-4729-91e0-03b9f1dc9e80-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.049411 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.049435 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc 
kubenswrapper[4724]: I1002 13:00:38.049478 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37720fc-c80f-4729-91e0-03b9f1dc9e80-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.049509 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e37720fc-c80f-4729-91e0-03b9f1dc9e80-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.108133 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-829dv" podStartSLOduration=78.108109091 podStartE2EDuration="1m18.108109091s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:38.0966874 +0000 UTC m=+102.551446551" watchObservedRunningTime="2025-10-02 13:00:38.108109091 +0000 UTC m=+102.562868212" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.108863 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podStartSLOduration=78.108857351 podStartE2EDuration="1m18.108857351s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:38.108235695 +0000 UTC m=+102.562994826" 
watchObservedRunningTime="2025-10-02 13:00:38.108857351 +0000 UTC m=+102.563616472" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150284 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37720fc-c80f-4729-91e0-03b9f1dc9e80-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150341 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e37720fc-c80f-4729-91e0-03b9f1dc9e80-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150410 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37720fc-c80f-4729-91e0-03b9f1dc9e80-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150440 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150460 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150550 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.150617 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e37720fc-c80f-4729-91e0-03b9f1dc9e80-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.151474 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37720fc-c80f-4729-91e0-03b9f1dc9e80-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.156459 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37720fc-c80f-4729-91e0-03b9f1dc9e80-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: 
I1002 13:00:38.174055 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e37720fc-c80f-4729-91e0-03b9f1dc9e80-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cwxmn\" (UID: \"e37720fc-c80f-4729-91e0-03b9f1dc9e80\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.296392 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.940921 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" event={"ID":"e37720fc-c80f-4729-91e0-03b9f1dc9e80","Type":"ContainerStarted","Data":"a30f4ce0d74d767b0e9c06cc5dc2cb860d383e14b4c40ef69b756e30d17bce79"} Oct 02 13:00:38 crc kubenswrapper[4724]: I1002 13:00:38.941213 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" event={"ID":"e37720fc-c80f-4729-91e0-03b9f1dc9e80","Type":"ContainerStarted","Data":"1e7542ce9a04d5be7749ea16835019e49a565480135f66604d90b5c7b8c4fc1d"} Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.262005 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.262152 4724 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.262237 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs podName:32e04071-6b34-4fc0-9783-f346a72fcf99 nodeName:}" failed. No retries permitted until 2025-10-02 13:01:43.262217782 +0000 UTC m=+167.716976903 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs") pod "network-metrics-daemon-q7t2t" (UID: "32e04071-6b34-4fc0-9783-f346a72fcf99") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.312756 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.312897 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.313103 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.313167 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.313290 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.313343 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.313442 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.313490 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:39 crc kubenswrapper[4724]: I1002 13:00:39.314350 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1" Oct 02 13:00:39 crc kubenswrapper[4724]: E1002 13:00:39.314498 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" Oct 02 13:00:41 crc kubenswrapper[4724]: I1002 13:00:41.313564 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:41 crc kubenswrapper[4724]: I1002 13:00:41.313626 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:41 crc kubenswrapper[4724]: I1002 13:00:41.313670 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:41 crc kubenswrapper[4724]: I1002 13:00:41.313564 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:41 crc kubenswrapper[4724]: E1002 13:00:41.313746 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:41 crc kubenswrapper[4724]: E1002 13:00:41.313867 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:41 crc kubenswrapper[4724]: E1002 13:00:41.313981 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:41 crc kubenswrapper[4724]: E1002 13:00:41.314081 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.313256 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.313324 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:43 crc kubenswrapper[4724]: E1002 13:00:43.313403 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.313414 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.313488 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:43 crc kubenswrapper[4724]: E1002 13:00:43.313809 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:43 crc kubenswrapper[4724]: E1002 13:00:43.313979 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:43 crc kubenswrapper[4724]: E1002 13:00:43.314015 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.327598 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cwxmn" podStartSLOduration=83.327580002 podStartE2EDuration="1m23.327580002s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:38.96003011 +0000 UTC m=+103.414789241" watchObservedRunningTime="2025-10-02 13:00:43.327580002 +0000 UTC m=+107.782339123" Oct 02 13:00:43 crc kubenswrapper[4724]: I1002 13:00:43.328434 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 13:00:45 crc kubenswrapper[4724]: I1002 13:00:45.313004 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:45 crc kubenswrapper[4724]: I1002 13:00:45.313019 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:45 crc kubenswrapper[4724]: I1002 13:00:45.313087 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:45 crc kubenswrapper[4724]: I1002 13:00:45.313173 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:45 crc kubenswrapper[4724]: E1002 13:00:45.313232 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:45 crc kubenswrapper[4724]: E1002 13:00:45.313281 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:45 crc kubenswrapper[4724]: E1002 13:00:45.313345 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:45 crc kubenswrapper[4724]: E1002 13:00:45.313495 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:46 crc kubenswrapper[4724]: I1002 13:00:46.339287 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.339270273 podStartE2EDuration="3.339270273s" podCreationTimestamp="2025-10-02 13:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:00:46.338331209 +0000 UTC m=+110.793090360" watchObservedRunningTime="2025-10-02 13:00:46.339270273 +0000 UTC m=+110.794029394" Oct 02 13:00:47 crc kubenswrapper[4724]: I1002 13:00:47.313114 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:47 crc kubenswrapper[4724]: I1002 13:00:47.313141 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:47 crc kubenswrapper[4724]: I1002 13:00:47.313191 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:47 crc kubenswrapper[4724]: E1002 13:00:47.313309 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:47 crc kubenswrapper[4724]: I1002 13:00:47.313338 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:47 crc kubenswrapper[4724]: E1002 13:00:47.313431 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:47 crc kubenswrapper[4724]: E1002 13:00:47.313575 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:47 crc kubenswrapper[4724]: E1002 13:00:47.313648 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:49 crc kubenswrapper[4724]: I1002 13:00:49.312694 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:00:49 crc kubenswrapper[4724]: I1002 13:00:49.312752 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:49 crc kubenswrapper[4724]: I1002 13:00:49.312695 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:00:49 crc kubenswrapper[4724]: I1002 13:00:49.312824 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:00:49 crc kubenswrapper[4724]: E1002 13:00:49.313825 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 13:00:49 crc kubenswrapper[4724]: E1002 13:00:49.313963 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99" Oct 02 13:00:49 crc kubenswrapper[4724]: E1002 13:00:49.314145 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 13:00:49 crc kubenswrapper[4724]: E1002 13:00:49.314247 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 13:00:51 crc kubenswrapper[4724]: I1002 13:00:51.312931 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:00:51 crc kubenswrapper[4724]: I1002 13:00:51.312946 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:51 crc kubenswrapper[4724]: I1002 13:00:51.313105 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:51 crc kubenswrapper[4724]: E1002 13:00:51.313234 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:51 crc kubenswrapper[4724]: E1002 13:00:51.313316 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:00:51 crc kubenswrapper[4724]: E1002 13:00:51.313385 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:51 crc kubenswrapper[4724]: I1002 13:00:51.313652 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:51 crc kubenswrapper[4724]: E1002 13:00:51.313736 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:52 crc kubenswrapper[4724]: I1002 13:00:52.313887 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1"
Oct 02 13:00:52 crc kubenswrapper[4724]: E1002 13:00:52.314048 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w58lt_openshift-ovn-kubernetes(4089ad23-969c-4222-a8ed-e141ec291e80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80"
Oct 02 13:00:53 crc kubenswrapper[4724]: I1002 13:00:53.313380 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:53 crc kubenswrapper[4724]: I1002 13:00:53.313482 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:00:53 crc kubenswrapper[4724]: I1002 13:00:53.313380 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:53 crc kubenswrapper[4724]: I1002 13:00:53.313402 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:53 crc kubenswrapper[4724]: E1002 13:00:53.313606 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:00:53 crc kubenswrapper[4724]: E1002 13:00:53.313693 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:53 crc kubenswrapper[4724]: E1002 13:00:53.314002 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:53 crc kubenswrapper[4724]: E1002 13:00:53.314162 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:55 crc kubenswrapper[4724]: I1002 13:00:55.313627 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:55 crc kubenswrapper[4724]: I1002 13:00:55.313651 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:55 crc kubenswrapper[4724]: I1002 13:00:55.313863 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:55 crc kubenswrapper[4724]: I1002 13:00:55.313970 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:00:55 crc kubenswrapper[4724]: E1002 13:00:55.313892 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:55 crc kubenswrapper[4724]: E1002 13:00:55.314096 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:00:55 crc kubenswrapper[4724]: E1002 13:00:55.313761 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:55 crc kubenswrapper[4724]: E1002 13:00:55.314175 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:55 crc kubenswrapper[4724]: I1002 13:00:55.999315 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/1.log"
Oct 02 13:00:56 crc kubenswrapper[4724]: I1002 13:00:56.000285 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/0.log"
Oct 02 13:00:56 crc kubenswrapper[4724]: I1002 13:00:56.000339 4724 generic.go:334] "Generic (PLEG): container finished" podID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4" containerID="f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15" exitCode=1
Oct 02 13:00:56 crc kubenswrapper[4724]: I1002 13:00:56.000400 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerDied","Data":"f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15"}
Oct 02 13:00:56 crc kubenswrapper[4724]: I1002 13:00:56.000516 4724 scope.go:117] "RemoveContainer" containerID="5fe9713cb9b004911140ef4802a6755a4d2a781fde439d7318ace93e4b608cef"
Oct 02 13:00:56 crc kubenswrapper[4724]: I1002 13:00:56.001152 4724 scope.go:117] "RemoveContainer" containerID="f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15"
Oct 02 13:00:56 crc kubenswrapper[4724]: E1002 13:00:56.001395 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pr276_openshift-multus(c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4)\"" pod="openshift-multus/multus-pr276" podUID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4"
Oct 02 13:00:56 crc kubenswrapper[4724]: E1002 13:00:56.269309 4724 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 02 13:00:56 crc kubenswrapper[4724]: E1002 13:00:56.403286 4724 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 13:00:57 crc kubenswrapper[4724]: I1002 13:00:57.004988 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/1.log"
Oct 02 13:00:57 crc kubenswrapper[4724]: I1002 13:00:57.313464 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:57 crc kubenswrapper[4724]: I1002 13:00:57.313477 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:57 crc kubenswrapper[4724]: I1002 13:00:57.313479 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:00:57 crc kubenswrapper[4724]: I1002 13:00:57.313678 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:57 crc kubenswrapper[4724]: E1002 13:00:57.313861 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:57 crc kubenswrapper[4724]: E1002 13:00:57.313961 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:57 crc kubenswrapper[4724]: E1002 13:00:57.313991 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:00:57 crc kubenswrapper[4724]: E1002 13:00:57.314132 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:59 crc kubenswrapper[4724]: I1002 13:00:59.313068 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:00:59 crc kubenswrapper[4724]: I1002 13:00:59.313127 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:00:59 crc kubenswrapper[4724]: E1002 13:00:59.313215 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:00:59 crc kubenswrapper[4724]: E1002 13:00:59.313336 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:00:59 crc kubenswrapper[4724]: I1002 13:00:59.313397 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:00:59 crc kubenswrapper[4724]: E1002 13:00:59.313449 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:00:59 crc kubenswrapper[4724]: I1002 13:00:59.313068 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:00:59 crc kubenswrapper[4724]: E1002 13:00:59.313526 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:01 crc kubenswrapper[4724]: I1002 13:01:01.313458 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:01 crc kubenswrapper[4724]: I1002 13:01:01.313729 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:01 crc kubenswrapper[4724]: I1002 13:01:01.313777 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:01 crc kubenswrapper[4724]: I1002 13:01:01.313832 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:01 crc kubenswrapper[4724]: E1002 13:01:01.314308 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:01 crc kubenswrapper[4724]: E1002 13:01:01.314345 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:01 crc kubenswrapper[4724]: E1002 13:01:01.314556 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:01 crc kubenswrapper[4724]: E1002 13:01:01.314869 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:01 crc kubenswrapper[4724]: E1002 13:01:01.404850 4724 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 13:01:03 crc kubenswrapper[4724]: I1002 13:01:03.312945 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:03 crc kubenswrapper[4724]: I1002 13:01:03.312976 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:03 crc kubenswrapper[4724]: E1002 13:01:03.313104 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:03 crc kubenswrapper[4724]: I1002 13:01:03.312999 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:03 crc kubenswrapper[4724]: E1002 13:01:03.313283 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:03 crc kubenswrapper[4724]: E1002 13:01:03.313357 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:03 crc kubenswrapper[4724]: I1002 13:01:03.313157 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:03 crc kubenswrapper[4724]: E1002 13:01:03.313451 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:05 crc kubenswrapper[4724]: I1002 13:01:05.313353 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:05 crc kubenswrapper[4724]: I1002 13:01:05.313366 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:05 crc kubenswrapper[4724]: I1002 13:01:05.313384 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:05 crc kubenswrapper[4724]: I1002 13:01:05.313484 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:05 crc kubenswrapper[4724]: E1002 13:01:05.313774 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:05 crc kubenswrapper[4724]: E1002 13:01:05.313845 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:05 crc kubenswrapper[4724]: E1002 13:01:05.313905 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:05 crc kubenswrapper[4724]: E1002 13:01:05.313954 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:05 crc kubenswrapper[4724]: I1002 13:01:05.314195 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1"
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.033709 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/3.log"
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.036835 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerStarted","Data":"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb"}
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.037251 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt"
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.066694 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podStartSLOduration=105.066676389 podStartE2EDuration="1m45.066676389s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:06.066376461 +0000 UTC m=+130.521135602" watchObservedRunningTime="2025-10-02 13:01:06.066676389 +0000 UTC m=+130.521435520"
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.167327 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q7t2t"]
Oct 02 13:01:06 crc kubenswrapper[4724]: I1002 13:01:06.167447 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:06 crc kubenswrapper[4724]: E1002 13:01:06.167548 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:06 crc kubenswrapper[4724]: E1002 13:01:06.405812 4724 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 13:01:07 crc kubenswrapper[4724]: I1002 13:01:07.313680 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:07 crc kubenswrapper[4724]: E1002 13:01:07.313846 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:07 crc kubenswrapper[4724]: I1002 13:01:07.313706 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:07 crc kubenswrapper[4724]: I1002 13:01:07.313717 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:07 crc kubenswrapper[4724]: I1002 13:01:07.313691 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:07 crc kubenswrapper[4724]: E1002 13:01:07.314030 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:07 crc kubenswrapper[4724]: E1002 13:01:07.313932 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:07 crc kubenswrapper[4724]: I1002 13:01:07.314061 4724 scope.go:117] "RemoveContainer" containerID="f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15"
Oct 02 13:01:07 crc kubenswrapper[4724]: E1002 13:01:07.314243 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:08 crc kubenswrapper[4724]: I1002 13:01:08.045646 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/1.log"
Oct 02 13:01:08 crc kubenswrapper[4724]: I1002 13:01:08.045950 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerStarted","Data":"14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076"}
Oct 02 13:01:09 crc kubenswrapper[4724]: I1002 13:01:09.313444 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:09 crc kubenswrapper[4724]: I1002 13:01:09.313467 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:09 crc kubenswrapper[4724]: E1002 13:01:09.314327 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:09 crc kubenswrapper[4724]: I1002 13:01:09.313528 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:09 crc kubenswrapper[4724]: E1002 13:01:09.314322 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:09 crc kubenswrapper[4724]: I1002 13:01:09.313498 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:09 crc kubenswrapper[4724]: E1002 13:01:09.314875 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:09 crc kubenswrapper[4724]: E1002 13:01:09.315035 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:10 crc kubenswrapper[4724]: I1002 13:01:10.324265 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt"
Oct 02 13:01:11 crc kubenswrapper[4724]: I1002 13:01:11.312611 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:11 crc kubenswrapper[4724]: I1002 13:01:11.312671 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:11 crc kubenswrapper[4724]: I1002 13:01:11.312708 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:11 crc kubenswrapper[4724]: E1002 13:01:11.312767 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 13:01:11 crc kubenswrapper[4724]: I1002 13:01:11.312819 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:11 crc kubenswrapper[4724]: E1002 13:01:11.312857 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 13:01:11 crc kubenswrapper[4724]: E1002 13:01:11.313147 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 13:01:11 crc kubenswrapper[4724]: E1002 13:01:11.313415 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q7t2t" podUID="32e04071-6b34-4fc0-9783-f346a72fcf99"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.312916 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.313499 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.313817 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.314430 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.315850 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.316587 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.316726 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.316927 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.317840 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 02 13:01:13 crc kubenswrapper[4724]: I1002 13:01:13.318322 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.454261 4724 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.494128 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xjrx9"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.503008 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x726v"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.503364 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.505303 4724 util.go:30] "No sandbox for
pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.506134 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.507769 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.508170 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-st27v"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.508356 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.508655 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.509004 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.509672 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.509995 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.510322 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.510786 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.512975 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-974gc"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.525070 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.529693 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.530779 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.530915 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.530777 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531104 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531280 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531459 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: 
I1002 13:01:18.531566 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531686 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531752 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.531850 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.532335 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.533099 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.533365 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534339 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534379 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config\") pod 
\"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534403 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534425 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534449 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gg2l\" (UniqueName: \"kubernetes.io/projected/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-kube-api-access-5gg2l\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534470 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrq7m\" (UniqueName: \"kubernetes.io/projected/486adec0-0d45-4294-ac1b-5fff0dda6602-kube-api-access-mrq7m\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 
13:01:18.534488 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-config\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534508 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-audit-policies\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.536802 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538192 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.534527 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k7f\" (UniqueName: \"kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538708 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538732 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-config\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538753 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-node-pullsecrets\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538775 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-images\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538801 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-serving-cert\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538828 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-encryption-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538848 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538871 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-etcd-client\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538895 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48p5f\" (UniqueName: \"kubernetes.io/projected/4501d69c-964b-4444-b8af-d56b9301a685-kube-api-access-48p5f\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538918 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538932 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538956 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f09ca7-146f-44c3-9137-76cff079b5bc-serving-cert\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.538986 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-serving-cert\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539005 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-serving-cert\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539024 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4501d69c-964b-4444-b8af-d56b9301a685-audit-dir\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539041 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr526\" (UniqueName: \"kubernetes.io/projected/e0f09ca7-146f-44c3-9137-76cff079b5bc-kube-api-access-mr526\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539061 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-image-import-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539077 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-encryption-config\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539112 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539139 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit-dir\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539166 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/486adec0-0d45-4294-ac1b-5fff0dda6602-machine-approver-tls\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539254 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-client\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539332 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-serving-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539384 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539415 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539479 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539517 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539621 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539667 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539699 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539750 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-auth-proxy-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.539781 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-available-featuregates\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539820 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bl8\" (UniqueName: \"kubernetes.io/projected/da7051e3-8a79-43e7-9016-9d492b51a9fd-kube-api-access-b8bl8\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539853 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrjd\" (UniqueName: \"kubernetes.io/projected/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-kube-api-access-wjrjd\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.539883 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn64z\" (UniqueName: \"kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.544093 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"] Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.545167 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.545460 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.549841 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.550213 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.550778 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.551012 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.551197 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.552371 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.552911 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.553297 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.553702 4724 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.554056 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.555279 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.555641 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.555929 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556283 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556347 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556387 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556482 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556626 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556679 4724 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556698 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556791 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556942 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556978 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557015 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.556277 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557065 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557166 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557236 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557351 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: 
I1002 13:01:18.557388 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557487 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557651 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.557808 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558034 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558188 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558344 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558463 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558498 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.558614 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.559301 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-kl4xt"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.559501 4724 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.559937 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.560445 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.563814 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.565412 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.566442 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.566716 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.566882 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567106 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v94bt"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567225 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567326 4724 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567398 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lvb24"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567483 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567648 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567658 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567742 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567843 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567937 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.567984 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568021 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568114 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568656 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568842 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568924 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.568989 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569054 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569092 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569586 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569610 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569764 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.569984 4724 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.570080 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.570249 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.572657 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.572701 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.573125 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.573508 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.573824 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.579771 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.581960 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.586738 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.587162 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.587498 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.587702 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.587915 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.588043 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.588176 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.588431 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.588589 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.588730 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.588909 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.589075 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.589224 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.592371 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9fscb"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.593734 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.597120 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4lzm2"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.598713 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.599926 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.600508 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.603479 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.603748 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.603812 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.603920 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.604046 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.606047 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.606459 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.616506 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.616710 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.617338 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: 
I1002 13:01:18.618506 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.620005 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-n5rln"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.620464 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.621567 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.622395 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.623255 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.623896 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.624025 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.624454 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.624507 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.625171 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.626133 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.626150 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.627996 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.628692 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.629290 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xnt99"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.634844 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.635182 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klq6x"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.635653 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.635684 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.635770 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.636622 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.637044 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.637417 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.637513 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.637618 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640752 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640797 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrb6x\" (UniqueName: \"kubernetes.io/projected/84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7-kube-api-access-zrb6x\") pod \"downloads-7954f5f757-kl4xt\" (UID: \"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7\") " pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640831 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640860 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640889 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640914 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640937 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.640965 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641001 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-auth-proxy-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.641029 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-available-featuregates\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641058 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bl8\" (UniqueName: \"kubernetes.io/projected/da7051e3-8a79-43e7-9016-9d492b51a9fd-kube-api-access-b8bl8\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641082 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrjd\" (UniqueName: \"kubernetes.io/projected/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-kube-api-access-wjrjd\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641107 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn64z\" (UniqueName: \"kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641132 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdn2\" (UniqueName: 
\"kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641158 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641180 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641202 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641225 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 
13:01:18.641250 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641272 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gg2l\" (UniqueName: \"kubernetes.io/projected/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-kube-api-access-5gg2l\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641294 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrq7m\" (UniqueName: \"kubernetes.io/projected/486adec0-0d45-4294-ac1b-5fff0dda6602-kube-api-access-mrq7m\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641314 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-config\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641337 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-audit-policies\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: 
\"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641365 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k7f\" (UniqueName: \"kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641391 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641415 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641484 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641509 4724 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641546 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-config\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641572 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-node-pullsecrets\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641596 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-images\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641618 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-serving-cert\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 
02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641644 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-encryption-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641664 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641673 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641708 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-etcd-client\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641732 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48p5f\" (UniqueName: \"kubernetes.io/projected/4501d69c-964b-4444-b8af-d56b9301a685-kube-api-access-48p5f\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641756 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641782 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641805 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641817 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.641830 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642077 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f09ca7-146f-44c3-9137-76cff079b5bc-serving-cert\") pod 
\"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642123 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642153 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-serving-cert\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642176 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642201 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-serving-cert\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642225 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4501d69c-964b-4444-b8af-d56b9301a685-audit-dir\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642250 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr526\" (UniqueName: \"kubernetes.io/projected/e0f09ca7-146f-44c3-9137-76cff079b5bc-kube-api-access-mr526\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642276 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-image-import-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642296 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-encryption-config\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642334 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 
13:01:18.642359 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit-dir\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642375 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/486adec0-0d45-4294-ac1b-5fff0dda6602-machine-approver-tls\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642391 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-client\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642408 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-serving-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642426 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642443 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642463 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642482 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642502 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642584 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.643640 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.643922 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.644030 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.645337 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-auth-proxy-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.645514 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.645675 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x726v"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.645710 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-available-featuregates\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.646061 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.646630 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.646630 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-node-pullsecrets\") pod \"apiserver-76f77b778f-xjrx9\" 
(UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.646863 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.647122 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-audit-policies\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.647252 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da7051e3-8a79-43e7-9016-9d492b51a9fd-audit-dir\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.647475 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-config\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.648101 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4501d69c-964b-4444-b8af-d56b9301a685-audit-dir\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.648845 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.649351 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.649682 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-image-import-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.654978 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-serving-cert\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.655621 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-client\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.655999 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/486adec0-0d45-4294-ac1b-5fff0dda6602-machine-approver-tls\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.656063 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-serving-cert\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.656321 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.656637 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da7051e3-8a79-43e7-9016-9d492b51a9fd-encryption-config\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.657130 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-config\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.658428 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: 
\"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.658826 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486adec0-0d45-4294-ac1b-5fff0dda6602-config\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.642485 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.673515 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.674958 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4501d69c-964b-4444-b8af-d56b9301a685-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.675626 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-serving-cert\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.675661 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f09ca7-146f-44c3-9137-76cff079b5bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.676035 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-etcd-client\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.676217 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da7051e3-8a79-43e7-9016-9d492b51a9fd-etcd-serving-ca\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.677351 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-st27v"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.677737 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.678217 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 
02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.678658 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.679618 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.688815 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.690398 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4501d69c-964b-4444-b8af-d56b9301a685-encryption-config\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.690524 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f09ca7-146f-44c3-9137-76cff079b5bc-serving-cert\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.701607 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4rwfr"] Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.703589 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xjrx9"] Oct 02 13:01:18 crc 
kubenswrapper[4724]: I1002 13:01:18.703655 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.704271 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.704755 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.704778 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cwvv5"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.705561 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kl4xt"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.705583 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tq9w6"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706226 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706248 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706337 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tq9w6"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706487 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706255 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706657 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706668 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706671 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-974gc"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706853 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lvb24"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706865 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v94bt"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.706941 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.707373 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.708407 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.708661 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.709429 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.710455 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.711483 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.712587 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9fscb"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.713614 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.714679 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4lzm2"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.715641 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.716585 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xnt99"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.717563 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.718783 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.719524 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.720596 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.721613 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.722858 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.724626 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.726336 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tq9w6"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.727599 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klq6x"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.727848 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.728692 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cwvv5"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.730617 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.732934 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.734086 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lqm4v"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.734664 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lqm4v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.735877 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kw2qg"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.736242 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kw2qg"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.737562 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4rwfr"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.738718 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lqm4v"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.739918 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.741030 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.742941 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.742989 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743027 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743046 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743075 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743092 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743115 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743131 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743161 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743176 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743193 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743208 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrb6x\" (UniqueName: \"kubernetes.io/projected/84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7-kube-api-access-zrb6x\") pod \"downloads-7954f5f757-kl4xt\" (UID: \"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7\") " pod="openshift-console/downloads-7954f5f757-kl4xt"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743231 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743246 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743285 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdn2\" (UniqueName: \"kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.743771 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.744693 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.744897 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.744926 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.745849 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.747276 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.747423 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.747502 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5"]
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.749818 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.755288 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.755789 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.755879 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.756136 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.756388 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.759011 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.769644 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.783173 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-images\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.789007 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.807234 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.827791 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.847728 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.868391 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.888496 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.908078 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.927441 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.947810 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.968042 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 02 13:01:18 crc kubenswrapper[4724]: I1002 13:01:18.988260 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.008317 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.028283 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.049341 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.068893 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.087993 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.108817 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.127653 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.148286 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.167891 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.187941 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.208431 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.227938 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.247991 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.268053 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.288241 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.307713 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.328820 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.348344 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.373853 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.388514 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.407603 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.427755 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.448236 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.467333 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.487645 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.507596 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.527752 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.548355 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.568255 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.590713 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.608156 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.628067 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.646451 4724 request.go:700] Waited for 1.017253426s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.648939 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.707658 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.728656 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.748243 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.768067 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.788663 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.809429 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.828835 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.849306 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.868081 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.888411 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.908962 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.928685 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.963849 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k7f\" (UniqueName: \"kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f\") pod \"route-controller-manager-6576b87f9c-qxfvd\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"
Oct 02 13:01:19 crc kubenswrapper[4724]: I1002 13:01:19.984251 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48p5f\" (UniqueName: \"kubernetes.io/projected/4501d69c-964b-4444-b8af-d56b9301a685-kube-api-access-48p5f\") pod \"apiserver-7bbb656c7d-n4xw8\" (UID: \"4501d69c-964b-4444-b8af-d56b9301a685\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.002120 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bl8\" (UniqueName: \"kubernetes.io/projected/da7051e3-8a79-43e7-9016-9d492b51a9fd-kube-api-access-b8bl8\") pod \"apiserver-76f77b778f-xjrx9\" (UID: \"da7051e3-8a79-43e7-9016-9d492b51a9fd\") " pod="openshift-apiserver/apiserver-76f77b778f-xjrx9"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.029761 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrjd\" (UniqueName: \"kubernetes.io/projected/4f4b9ef0-2fa1-4d48-81c9-e428e93c7034-kube-api-access-wjrjd\") pod \"openshift-config-operator-7777fb866f-974gc\" (UID: \"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.045581 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.049225 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn64z\" (UniqueName: \"kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z\") pod \"controller-manager-879f6c89f-zpmh7\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.065122 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr526\" (UniqueName: \"kubernetes.io/projected/e0f09ca7-146f-44c3-9137-76cff079b5bc-kube-api-access-mr526\") pod \"authentication-operator-69f744f599-st27v\" (UID: \"e0f09ca7-146f-44c3-9137-76cff079b5bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.069875 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.084156 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrq7m\" (UniqueName: \"kubernetes.io/projected/486adec0-0d45-4294-ac1b-5fff0dda6602-kube-api-access-mrq7m\") pod \"machine-approver-56656f9798-zl9kr\" (UID: \"486adec0-0d45-4294-ac1b-5fff0dda6602\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.099376 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.103031 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.107786 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 02 13:01:20 crc kubenswrapper[4724]: W1002 13:01:20.115458 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486adec0_0d45_4294_ac1b_5fff0dda6602.slice/crio-527c7d77789e4877acc20367919fd404e582314766281f21e202217b1e5c53db WatchSource:0}: Error finding container 527c7d77789e4877acc20367919fd404e582314766281f21e202217b1e5c53db: Status 404 returned error can't find the container with id 527c7d77789e4877acc20367919fd404e582314766281f21e202217b1e5c53db
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.121800 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.127859 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.137862 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v"
Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.145012 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.148499 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.187398 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gg2l\" (UniqueName: \"kubernetes.io/projected/27c39377-9fc1-4ca9-8ce4-8a1c61f181c0-kube-api-access-5gg2l\") pod \"machine-api-operator-5694c8668f-x726v\" (UID: \"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.188028 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.204413 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.209208 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.228902 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.248975 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.268901 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.288236 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.307521 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.329011 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.329575 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.329605 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xjrx9"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.348126 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.357224 4724 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.358848 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.368201 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.388564 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.408638 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.427994 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: W1002 13:01:20.428805 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9699ff16_3d72_4ba6_9055_6b707c3e223f.slice/crio-a74d4dce04f0ae9b58c7b830d83ad2914b2ce61a3f7091f8cc6396c4dd547327 WatchSource:0}: Error finding container a74d4dce04f0ae9b58c7b830d83ad2914b2ce61a3f7091f8cc6396c4dd547327: Status 404 returned error can't find the container with id a74d4dce04f0ae9b58c7b830d83ad2914b2ce61a3f7091f8cc6396c4dd547327 Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.434392 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-974gc"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.447471 4724 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.467277 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.488725 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.507778 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.557978 4724 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.559061 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.567661 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.589346 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.591652 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.599992 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-st27v"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.607930 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 13:01:20 crc kubenswrapper[4724]: W1002 13:01:20.610730 
4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f09ca7_146f_44c3_9137_76cff079b5bc.slice/crio-5f11f3ffac03c9b6aae7003a768af0406d7d4b2195af2b869abf6520e77c448b WatchSource:0}: Error finding container 5f11f3ffac03c9b6aae7003a768af0406d7d4b2195af2b869abf6520e77c448b: Status 404 returned error can't find the container with id 5f11f3ffac03c9b6aae7003a768af0406d7d4b2195af2b869abf6520e77c448b Oct 02 13:01:20 crc kubenswrapper[4724]: W1002 13:01:20.618809 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4501d69c_964b_4444_b8af_d56b9301a685.slice/crio-b1b0cd9e3fa1c32b5017d4239eded774916d049644505b35eaad30d3a3e9896c WatchSource:0}: Error finding container b1b0cd9e3fa1c32b5017d4239eded774916d049644505b35eaad30d3a3e9896c: Status 404 returned error can't find the container with id b1b0cd9e3fa1c32b5017d4239eded774916d049644505b35eaad30d3a3e9896c Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.627859 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.647582 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.652042 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x726v"] Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.666819 4724 request.go:700] Waited for 1.930409159s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.668182 4724 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.688344 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.708663 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.744468 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdn2\" (UniqueName: \"kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2\") pod \"oauth-openshift-558db77b4-m8t6v\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.763789 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrb6x\" (UniqueName: \"kubernetes.io/projected/84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7-kube-api-access-zrb6x\") pod \"downloads-7954f5f757-kl4xt\" (UID: \"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7\") " pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.818745 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.834978 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.867247 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5bb856b-60df-44d0-9979-906fc271f66e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.867638 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-oauth-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.867677 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.867696 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-trusted-ca\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.867918 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rn974\" (UniqueName: \"kubernetes.io/projected/ac67382d-e26c-48c3-933e-19fecd4d5d49-kube-api-access-rn974\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868252 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-proxy-tls\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868281 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6578bd1a-eaad-452a-adf5-7f3e34838677-service-ca-bundle\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: E1002 13:01:20.868345 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.368330117 +0000 UTC m=+145.823089318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868510 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-srv-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868689 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868718 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868897 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868947 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-config\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.868964 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvc4\" (UniqueName: \"kubernetes.io/projected/131c7969-a8c8-4cfc-b655-ac3d400fae1b-kube-api-access-6jvc4\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869135 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvth8\" (UniqueName: \"kubernetes.io/projected/5a3d5592-2d16-4d75-a734-664f6dd16418-kube-api-access-dvth8\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869259 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfsx8\" (UniqueName: \"kubernetes.io/projected/6578bd1a-eaad-452a-adf5-7f3e34838677-kube-api-access-lfsx8\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 
13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869295 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869450 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-config\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869586 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152155f3-933d-43c5-abeb-7c06899a6939-serving-cert\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869609 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-config\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.869913 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token\") pod 
\"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870052 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870077 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-service-ca\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870273 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthqw\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-kube-api-access-dthqw\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870307 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac67382d-e26c-48c3-933e-19fecd4d5d49-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870342 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-oauth-config\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870370 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wjlh\" (UniqueName: \"kubernetes.io/projected/f0cb7f77-88ee-46c4-9c81-aa953416aec1-kube-api-access-4wjlh\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870389 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-srv-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870674 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870901 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qtp\" (UniqueName: \"kubernetes.io/projected/5111b397-09d5-4412-9135-2ea4914c00db-kube-api-access-29qtp\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870929 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870956 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5bb856b-60df-44d0-9979-906fc271f66e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.870992 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l7td\" (UniqueName: \"kubernetes.io/projected/42570dd4-bfb7-43a7-9f10-bf0df9236925-kube-api-access-5l7td\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871017 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871045 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/59e1178f-be06-4966-9e77-031df1e58c1a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871070 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-trusted-ca-bundle\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871103 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b330608-20dc-445e-bf75-4393541c7fd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871145 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871179 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrtn7\" (UniqueName: \"kubernetes.io/projected/21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9-kube-api-access-mrtn7\") pod \"migrator-59844c95c7-f2jql\" (UID: \"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871267 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/499ba1de-99e3-4a0c-be96-866d0127402d-metrics-tls\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871296 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-images\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871415 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5111b397-09d5-4412-9135-2ea4914c00db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871440 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-config\") pod \"console-f9d7485db-lvb24\" (UID: 
\"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871472 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-metrics-certs\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871514 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b397-09d5-4412-9135-2ea4914c00db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871697 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bb856b-60df-44d0-9979-906fc271f66e-config\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871757 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-client\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.871868 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-stats-auth\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872076 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2h7\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872120 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-default-certificate\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872382 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872408 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872456 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872474 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85b8l\" (UniqueName: \"kubernetes.io/projected/499ba1de-99e3-4a0c-be96-866d0127402d-kube-api-access-85b8l\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872500 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872822 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxw8\" (UniqueName: \"kubernetes.io/projected/2b666d74-c6b0-4909-83af-2b736c0e032a-kube-api-access-mxxw8\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872884 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qckhw\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-kube-api-access-qckhw\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.872907 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.873836 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac67382d-e26c-48c3-933e-19fecd4d5d49-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.875718 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hq8j\" (UniqueName: \"kubernetes.io/projected/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-kube-api-access-7hq8j\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.875758 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-service-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.878258 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.878315 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b330608-20dc-445e-bf75-4393541c7fd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.878364 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e1178f-be06-4966-9e77-031df1e58c1a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.878906 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpxzz\" (UniqueName: \"kubernetes.io/projected/152155f3-933d-43c5-abeb-7c06899a6939-kube-api-access-zpxzz\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc 
kubenswrapper[4724]: I1002 13:01:20.879213 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-serving-cert\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.879398 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.879663 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b666d74-c6b0-4909-83af-2b736c0e032a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.879710 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983194 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:20 crc kubenswrapper[4724]: E1002 13:01:20.983522 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.483426919 +0000 UTC m=+145.938186040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983676 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983710 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983742 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983768 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xvl\" (UniqueName: \"kubernetes.io/projected/b75eedff-1c9f-456b-800e-6eebbf0db535-kube-api-access-g8xvl\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983794 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-service-ca\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983833 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dthqw\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-kube-api-access-dthqw\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983855 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac67382d-e26c-48c3-933e-19fecd4d5d49-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 
13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983879 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wjlh\" (UniqueName: \"kubernetes.io/projected/f0cb7f77-88ee-46c4-9c81-aa953416aec1-kube-api-access-4wjlh\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983898 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-oauth-config\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983918 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-srv-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983939 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nnlj\" (UniqueName: \"kubernetes.io/projected/c8812b3c-bd75-49a8-b2a7-3db91675fc09-kube-api-access-7nnlj\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983962 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: 
\"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.983997 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qtp\" (UniqueName: \"kubernetes.io/projected/5111b397-09d5-4412-9135-2ea4914c00db-kube-api-access-29qtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984021 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984041 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5bb856b-60df-44d0-9979-906fc271f66e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984072 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l7td\" (UniqueName: \"kubernetes.io/projected/42570dd4-bfb7-43a7-9f10-bf0df9236925-kube-api-access-5l7td\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984095 4724 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984114 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-trusted-ca-bundle\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984134 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/59e1178f-be06-4966-9e77-031df1e58c1a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984159 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhd5\" (UniqueName: \"kubernetes.io/projected/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-kube-api-access-blhd5\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984181 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b330608-20dc-445e-bf75-4393541c7fd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: 
\"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984203 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984229 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrtn7\" (UniqueName: \"kubernetes.io/projected/21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9-kube-api-access-mrtn7\") pod \"migrator-59844c95c7-f2jql\" (UID: \"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984252 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984275 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhddp\" (UniqueName: \"kubernetes.io/projected/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-kube-api-access-jhddp\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984300 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/499ba1de-99e3-4a0c-be96-866d0127402d-metrics-tls\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984322 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2gb\" (UniqueName: \"kubernetes.io/projected/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-kube-api-access-rp2gb\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984343 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-registration-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984368 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5111b397-09d5-4412-9135-2ea4914c00db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984389 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-config\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " 
pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984408 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-images\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984432 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-metrics-certs\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984436 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984456 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c9cecbe-09ba-4a38-a541-f897b225f416-proxy-tls\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984481 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b397-09d5-4412-9135-2ea4914c00db-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984522 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-mountpoint-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984580 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bb856b-60df-44d0-9979-906fc271f66e-config\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984603 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-cert\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984625 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b75eedff-1c9f-456b-800e-6eebbf0db535-serving-cert\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984678 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-client\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984704 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzrh\" (UniqueName: \"kubernetes.io/projected/8c9cecbe-09ba-4a38-a541-f897b225f416-kube-api-access-pnzrh\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984729 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-stats-auth\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984755 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984794 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-key\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984819 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-default-certificate\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984825 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984842 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984863 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2h7\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984887 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984911 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984934 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984955 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85b8l\" (UniqueName: \"kubernetes.io/projected/499ba1de-99e3-4a0c-be96-866d0127402d-kube-api-access-85b8l\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.984977 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985003 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghc4r\" (UniqueName: \"kubernetes.io/projected/efa1155d-cd1a-496d-94e8-eecbee129061-kube-api-access-ghc4r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: \"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985042 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985083 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxxw8\" (UniqueName: \"kubernetes.io/projected/2b666d74-c6b0-4909-83af-2b736c0e032a-kube-api-access-mxxw8\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985108 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-csi-data-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985133 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qckhw\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-kube-api-access-qckhw\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985160 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985190 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpktr\" (UniqueName: \"kubernetes.io/projected/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-kube-api-access-hpktr\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985219 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac67382d-e26c-48c3-933e-19fecd4d5d49-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985242 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa4c4f2-9475-482e-b428-c2ec0abc2842-config-volume\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " 
pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985285 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hq8j\" (UniqueName: \"kubernetes.io/projected/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-kube-api-access-7hq8j\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985308 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-service-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985333 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6zv\" (UniqueName: \"kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985370 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985405 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2b330608-20dc-445e-bf75-4393541c7fd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985422 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-service-ca\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985430 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e1178f-be06-4966-9e77-031df1e58c1a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985454 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f56s4\" (UniqueName: \"kubernetes.io/projected/8fa4c4f2-9475-482e-b428-c2ec0abc2842-kube-api-access-f56s4\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985478 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985503 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zpxzz\" (UniqueName: \"kubernetes.io/projected/152155f3-933d-43c5-abeb-7c06899a6939-kube-api-access-zpxzz\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985526 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c9cecbe-09ba-4a38-a541-f897b225f416-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985565 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0fe37aaa-cc86-416c-9718-97b43f158977-tmpfs\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985590 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-node-bootstrap-token\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985633 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-serving-cert\") pod \"etcd-operator-b45778765-9fscb\" (UID: 
\"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985658 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985681 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-certs\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985701 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dlhl\" (UniqueName: \"kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985728 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985750 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjs4f\" (UniqueName: \"kubernetes.io/projected/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-kube-api-access-sjs4f\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985803 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b666d74-c6b0-4909-83af-2b736c0e032a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985827 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985850 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fa4c4f2-9475-482e-b428-c2ec0abc2842-metrics-tls\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985885 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-plugins-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " 
pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985910 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5bb856b-60df-44d0-9979-906fc271f66e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985948 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-oauth-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985971 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.985992 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-socket-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986016 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j7th\" (UniqueName: 
\"kubernetes.io/projected/0fe37aaa-cc86-416c-9718-97b43f158977-kube-api-access-6j7th\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986043 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986067 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-trusted-ca\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986092 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn974\" (UniqueName: \"kubernetes.io/projected/ac67382d-e26c-48c3-933e-19fecd4d5d49-kube-api-access-rn974\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986139 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-proxy-tls\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 
13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986163 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6578bd1a-eaad-452a-adf5-7f3e34838677-service-ca-bundle\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986187 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efa1155d-cd1a-496d-94e8-eecbee129061-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: \"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986239 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-srv-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986263 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986261 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986285 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b75eedff-1c9f-456b-800e-6eebbf0db535-config\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986495 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986688 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.986731 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-config\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.988755 4724 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-trusted-ca-bundle\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.990768 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5111b397-09d5-4412-9135-2ea4914c00db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.992222 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/499ba1de-99e3-4a0c-be96-866d0127402d-metrics-tls\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.992322 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.993115 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-service-ca\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 
13:01:20.994212 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/59e1178f-be06-4966-9e77-031df1e58c1a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.994547 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-oauth-config\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: E1002 13:01:20.994862 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.494841267 +0000 UTC m=+145.949600388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.995459 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-config\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.995491 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.996191 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bb856b-60df-44d0-9979-906fc271f66e-config\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.996905 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b330608-20dc-445e-bf75-4393541c7fd4-trusted-ca\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.998265 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e1178f-be06-4966-9e77-031df1e58c1a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.998550 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-images\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.998817 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac67382d-e26c-48c3-933e-19fecd4d5d49-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:20 crc kubenswrapper[4724]: I1002 13:01:20.999629 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.000441 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6578bd1a-eaad-452a-adf5-7f3e34838677-service-ca-bundle\") 
pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.002164 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b666d74-c6b0-4909-83af-2b736c0e032a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.003887 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-trusted-ca\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.003993 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-serving-cert\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.004285 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-default-certificate\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.004336 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f5bb856b-60df-44d0-9979-906fc271f66e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.004865 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b330608-20dc-445e-bf75-4393541c7fd4-metrics-tls\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.005194 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac67382d-e26c-48c3-933e-19fecd4d5d49-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.005340 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-srv-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006155 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-config\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006226 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctbt\" (UniqueName: \"kubernetes.io/projected/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-kube-api-access-6ctbt\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006260 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006308 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-cabundle\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006324 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006362 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvc4\" (UniqueName: \"kubernetes.io/projected/131c7969-a8c8-4cfc-b655-ac3d400fae1b-kube-api-access-6jvc4\") pod \"console-f9d7485db-lvb24\" (UID: 
\"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006393 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfsx8\" (UniqueName: \"kubernetes.io/projected/6578bd1a-eaad-452a-adf5-7f3e34838677-kube-api-access-lfsx8\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006459 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvth8\" (UniqueName: \"kubernetes.io/projected/5a3d5592-2d16-4d75-a734-664f6dd16418-kube-api-access-dvth8\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006498 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006522 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-config\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006569 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-config\") pod 
\"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006595 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-webhook-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.006668 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152155f3-933d-43c5-abeb-7c06899a6939-serving-cert\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.007207 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/131c7969-a8c8-4cfc-b655-ac3d400fae1b-oauth-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.007818 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b397-09d5-4412-9135-2ea4914c00db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.008233 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.008313 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.008335 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152155f3-933d-43c5-abeb-7c06899a6939-config\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.008626 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.008930 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3d5592-2d16-4d75-a734-664f6dd16418-config\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.010193 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42570dd4-bfb7-43a7-9f10-bf0df9236925-srv-cert\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.010225 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-metrics-certs\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.010749 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6578bd1a-eaad-452a-adf5-7f3e34838677-stats-auth\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.011606 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a3d5592-2d16-4d75-a734-664f6dd16418-etcd-client\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.012011 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-proxy-tls\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.013367 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/131c7969-a8c8-4cfc-b655-ac3d400fae1b-console-serving-cert\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.014130 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152155f3-933d-43c5-abeb-7c06899a6939-serving-cert\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.014884 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f0cb7f77-88ee-46c4-9c81-aa953416aec1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.015187 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.027719 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 
13:01:21.039459 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"] Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.043172 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrtn7\" (UniqueName: \"kubernetes.io/projected/21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9-kube-api-access-mrtn7\") pod \"migrator-59844c95c7-f2jql\" (UID: \"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.055098 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kl4xt"] Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.064309 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dthqw\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-kube-api-access-dthqw\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.083787 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l7td\" (UniqueName: \"kubernetes.io/projected/42570dd4-bfb7-43a7-9f10-bf0df9236925-kube-api-access-5l7td\") pod \"catalog-operator-68c6474976-8zmbp\" (UID: \"42570dd4-bfb7-43a7-9f10-bf0df9236925\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.093917 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" event={"ID":"9699ff16-3d72-4ba6-9055-6b707c3e223f","Type":"ContainerStarted","Data":"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.093986 4724 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" event={"ID":"9699ff16-3d72-4ba6-9055-6b707c3e223f","Type":"ContainerStarted","Data":"a74d4dce04f0ae9b58c7b830d83ad2914b2ce61a3f7091f8cc6396c4dd547327"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.094931 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.096402 4724 generic.go:334] "Generic (PLEG): container finished" podID="da7051e3-8a79-43e7-9016-9d492b51a9fd" containerID="b801b82839927222b6f0c6699ba1680a13aa41c5ca7d0335e3b866c5145eaa0e" exitCode=0 Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.096461 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" event={"ID":"da7051e3-8a79-43e7-9016-9d492b51a9fd","Type":"ContainerDied","Data":"b801b82839927222b6f0c6699ba1680a13aa41c5ca7d0335e3b866c5145eaa0e"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.096481 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" event={"ID":"da7051e3-8a79-43e7-9016-9d492b51a9fd","Type":"ContainerStarted","Data":"661397798fbe3b346e839375cc79e96be88333af1570c6feb716342b18827818"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.099281 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" event={"ID":"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0","Type":"ContainerStarted","Data":"99adeef7549c29406bf967d448d39ea335b97d393413fafb9a5e6c477d7fbe37"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.099348 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" 
event={"ID":"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0","Type":"ContainerStarted","Data":"b877606837961514759f762aafbc3917577cd415cb6b736b5ca7ee99b11eb54f"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.102114 4724 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zpmh7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.102167 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.102373 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" event={"ID":"e0f09ca7-146f-44c3-9137-76cff079b5bc","Type":"ContainerStarted","Data":"d225839b9858b7e329a6ccb9f44a53909b161238b79e17aa83696a5bc3976a5a"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.102415 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" event={"ID":"e0f09ca7-146f-44c3-9137-76cff079b5bc","Type":"ContainerStarted","Data":"5f11f3ffac03c9b6aae7003a768af0406d7d4b2195af2b869abf6520e77c448b"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.105349 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hq8j\" (UniqueName: \"kubernetes.io/projected/7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16-kube-api-access-7hq8j\") pod \"machine-config-operator-74547568cd-vg57j\" (UID: \"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.106079 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" event={"ID":"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034","Type":"ContainerStarted","Data":"a30232449d01583ffaaca47e6d49b6d2d70df344fa2e723c212820229d215813"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.106115 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" event={"ID":"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034","Type":"ContainerStarted","Data":"63576a310ab08a14df7c569a40489303ba0dc83f136a44cdce6b16484923f9e7"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.107223 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.107396 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.607374034 +0000 UTC m=+146.062133155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.107601 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhd5\" (UniqueName: \"kubernetes.io/projected/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-kube-api-access-blhd5\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.107631 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.107651 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhddp\" (UniqueName: \"kubernetes.io/projected/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-kube-api-access-jhddp\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.107716 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2gb\" (UniqueName: \"kubernetes.io/projected/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-kube-api-access-rp2gb\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.108979 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-registration-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109085 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c9cecbe-09ba-4a38-a541-f897b225f416-proxy-tls\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109126 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-mountpoint-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109129 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" event={"ID":"486adec0-0d45-4294-ac1b-5fff0dda6602","Type":"ContainerStarted","Data":"a280b8177e0d8c331b6da10bf6d0660b6ff34b2f0d3be7c514fc6d25e3f27207"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109155 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-cert\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109161 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" event={"ID":"486adec0-0d45-4294-ac1b-5fff0dda6602","Type":"ContainerStarted","Data":"527c7d77789e4877acc20367919fd404e582314766281f21e202217b1e5c53db"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109183 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b75eedff-1c9f-456b-800e-6eebbf0db535-serving-cert\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109211 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzrh\" (UniqueName: \"kubernetes.io/projected/8c9cecbe-09ba-4a38-a541-f897b225f416-kube-api-access-pnzrh\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109237 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109242 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-mountpoint-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109258 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-key\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109296 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109356 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghc4r\" (UniqueName: \"kubernetes.io/projected/efa1155d-cd1a-496d-94e8-eecbee129061-kube-api-access-ghc4r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: \"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109380 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109412 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-csi-data-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109438 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpktr\" (UniqueName: \"kubernetes.io/projected/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-kube-api-access-hpktr\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109470 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa4c4f2-9475-482e-b428-c2ec0abc2842-config-volume\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109483 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-registration-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109506 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6zv\" (UniqueName: \"kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv\") pod \"collect-profiles-29323500-vz2xt\" (UID: 
\"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109568 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f56s4\" (UniqueName: \"kubernetes.io/projected/8fa4c4f2-9475-482e-b428-c2ec0abc2842-kube-api-access-f56s4\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109597 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109637 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c9cecbe-09ba-4a38-a541-f897b225f416-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109664 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0fe37aaa-cc86-416c-9718-97b43f158977-tmpfs\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.109691 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-node-bootstrap-token\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110147 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dlhl\" (UniqueName: \"kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110182 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110201 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-certs\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110228 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjs4f\" (UniqueName: \"kubernetes.io/projected/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-kube-api-access-sjs4f\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 
13:01:21.110246 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fa4c4f2-9475-482e-b428-c2ec0abc2842-metrics-tls\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110265 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-plugins-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110297 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110319 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j7th\" (UniqueName: \"kubernetes.io/projected/0fe37aaa-cc86-416c-9718-97b43f158977-kube-api-access-6j7th\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110334 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-socket-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110355 
4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110383 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efa1155d-cd1a-496d-94e8-eecbee129061-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: \"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110402 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b75eedff-1c9f-456b-800e-6eebbf0db535-config\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110425 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctbt\" (UniqueName: \"kubernetes.io/projected/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-kube-api-access-6ctbt\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110442 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics\") 
pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110465 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-cabundle\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110507 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-webhook-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110526 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110564 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xvl\" (UniqueName: \"kubernetes.io/projected/b75eedff-1c9f-456b-800e-6eebbf0db535-kube-api-access-g8xvl\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110594 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nnlj\" 
(UniqueName: \"kubernetes.io/projected/c8812b3c-bd75-49a8-b2a7-3db91675fc09-kube-api-access-7nnlj\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.110948 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.111157 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-plugins-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.111302 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.611277063 +0000 UTC m=+146.066036184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.111169 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.112164 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-socket-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.112895 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8812b3c-bd75-49a8-b2a7-3db91675fc09-csi-data-dir\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.113387 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b75eedff-1c9f-456b-800e-6eebbf0db535-config\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.113907 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.114015 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0fe37aaa-cc86-416c-9718-97b43f158977-tmpfs\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.114290 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.114311 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.114442 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8c9cecbe-09ba-4a38-a541-f897b225f416-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.115010 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fa4c4f2-9475-482e-b428-c2ec0abc2842-config-volume\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.115463 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b75eedff-1c9f-456b-800e-6eebbf0db535-serving-cert\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.115842 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c9cecbe-09ba-4a38-a541-f897b225f416-proxy-tls\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.116548 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-key\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.116794 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-cert\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.116885 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-node-bootstrap-token\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.116990 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fa4c4f2-9475-482e-b428-c2ec0abc2842-metrics-tls\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117250 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" event={"ID":"bbeff7ab-5d85-4709-bd85-22d6e99ff30c","Type":"ContainerStarted","Data":"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117303 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" event={"ID":"bbeff7ab-5d85-4709-bd85-22d6e99ff30c","Type":"ContainerStarted","Data":"b8fbc49c6c25c04d4d5d4465d02273a374657c359202a7297bee89248aa7aac4"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117549 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117624 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117681 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.117974 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.120518 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe37aaa-cc86-416c-9718-97b43f158977-webhook-cert\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.120587 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/efa1155d-cd1a-496d-94e8-eecbee129061-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: 
\"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121377 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121457 4724 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qxfvd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121502 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121668 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-signing-cabundle\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121802 4724 generic.go:334] "Generic (PLEG): container finished" podID="4501d69c-964b-4444-b8af-d56b9301a685" containerID="c9789b98c98062cbd849f502ecd672ea36c4581e520d84fb86f311017dffb2ce" exitCode=0 
Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121858 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" event={"ID":"4501d69c-964b-4444-b8af-d56b9301a685","Type":"ContainerDied","Data":"c9789b98c98062cbd849f502ecd672ea36c4581e520d84fb86f311017dffb2ce"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.121888 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" event={"ID":"4501d69c-964b-4444-b8af-d56b9301a685","Type":"ContainerStarted","Data":"b1b0cd9e3fa1c32b5017d4239eded774916d049644505b35eaad30d3a3e9896c"} Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.127580 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-certs\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.132873 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wjlh\" (UniqueName: \"kubernetes.io/projected/f0cb7f77-88ee-46c4-9c81-aa953416aec1-kube-api-access-4wjlh\") pod \"olm-operator-6b444d44fb-d2rqx\" (UID: \"f0cb7f77-88ee-46c4-9c81-aa953416aec1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.145042 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9c024ff-b6a9-4b92-8c1d-debc51c10ec9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-t8hf4\" (UID: \"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.168201 4724 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.184046 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85b8l\" (UniqueName: \"kubernetes.io/projected/499ba1de-99e3-4a0c-be96-866d0127402d-kube-api-access-85b8l\") pod \"dns-operator-744455d44c-4lzm2\" (UID: \"499ba1de-99e3-4a0c-be96-866d0127402d\") " pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.198366 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.211440 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.211718 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.711685622 +0000 UTC m=+146.166444753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.212179 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.213134 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.713113389 +0000 UTC m=+146.167872700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.217794 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxw8\" (UniqueName: \"kubernetes.io/projected/2b666d74-c6b0-4909-83af-2b736c0e032a-kube-api-access-mxxw8\") pod \"cluster-samples-operator-665b6dd947-w4dqz\" (UID: \"2b666d74-c6b0-4909-83af-2b736c0e032a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.232038 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.235419 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b330608-20dc-445e-bf75-4393541c7fd4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hdbqt\" (UID: \"2b330608-20dc-445e-bf75-4393541c7fd4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.246082 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpxzz\" (UniqueName: \"kubernetes.io/projected/152155f3-933d-43c5-abeb-7c06899a6939-kube-api-access-zpxzz\") pod \"console-operator-58897d9998-v94bt\" (UID: \"152155f3-933d-43c5-abeb-7c06899a6939\") " pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.255836 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.256182 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.264113 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.266954 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn974\" (UniqueName: \"kubernetes.io/projected/ac67382d-e26c-48c3-933e-19fecd4d5d49-kube-api-access-rn974\") pod \"openshift-apiserver-operator-796bbdcf4f-jpfxn\" (UID: \"ac67382d-e26c-48c3-933e-19fecd4d5d49\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.272913 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.277110 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.286093 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qckhw\" (UniqueName: \"kubernetes.io/projected/59e1178f-be06-4966-9e77-031df1e58c1a-kube-api-access-qckhw\") pod \"cluster-image-registry-operator-dc59b4c8b-tj5h8\" (UID: \"59e1178f-be06-4966-9e77-031df1e58c1a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.305431 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d28f605-0e15-46b5-9c71-b6a123c0e0ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fh5fg\" (UID: \"1d28f605-0e15-46b5-9c71-b6a123c0e0ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.314150 4724 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.314763 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.814654357 +0000 UTC m=+146.269413478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.314954 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.315597 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.815585471 +0000 UTC m=+146.270344592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.327810 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2h7\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.345123 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f5bb856b-60df-44d0-9979-906fc271f66e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwb55\" (UID: \"f5bb856b-60df-44d0-9979-906fc271f66e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.372807 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qtp\" (UniqueName: \"kubernetes.io/projected/5111b397-09d5-4412-9135-2ea4914c00db-kube-api-access-29qtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-dlnq8\" (UID: \"5111b397-09d5-4412-9135-2ea4914c00db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.387405 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfsx8\" (UniqueName: 
\"kubernetes.io/projected/6578bd1a-eaad-452a-adf5-7f3e34838677-kube-api-access-lfsx8\") pod \"router-default-5444994796-n5rln\" (UID: \"6578bd1a-eaad-452a-adf5-7f3e34838677\") " pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.402400 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvc4\" (UniqueName: \"kubernetes.io/projected/131c7969-a8c8-4cfc-b655-ac3d400fae1b-kube-api-access-6jvc4\") pod \"console-f9d7485db-lvb24\" (UID: \"131c7969-a8c8-4cfc-b655-ac3d400fae1b\") " pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.407416 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.416756 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.417295 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:21.917276094 +0000 UTC m=+146.372035215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.426785 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.427884 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvth8\" (UniqueName: \"kubernetes.io/projected/5a3d5592-2d16-4d75-a734-664f6dd16418-kube-api-access-dvth8\") pod \"etcd-operator-b45778765-9fscb\" (UID: \"5a3d5592-2d16-4d75-a734-664f6dd16418\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.444218 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.467343 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.474955 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.484103 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.484669 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhd5\" (UniqueName: \"kubernetes.io/projected/a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1-kube-api-access-blhd5\") pod \"package-server-manager-789f6589d5-qbbfj\" (UID: \"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.505737 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.518875 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.518964 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhddp\" (UniqueName: \"kubernetes.io/projected/324d6aa8-e27f-49ab-9b8e-c72665cc2f34-kube-api-access-jhddp\") pod \"ingress-canary-lqm4v\" (UID: \"324d6aa8-e27f-49ab-9b8e-c72665cc2f34\") " pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.519264 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.019247944 +0000 UTC m=+146.474007065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.519759 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.525437 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2gb\" (UniqueName: \"kubernetes.io/projected/89edffbe-3576-4a6e-8bd6-884ee6fbc58d-kube-api-access-rp2gb\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6hlf\" (UID: \"89edffbe-3576-4a6e-8bd6-884ee6fbc58d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.534154 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghc4r\" (UniqueName: \"kubernetes.io/projected/efa1155d-cd1a-496d-94e8-eecbee129061-kube-api-access-ghc4r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wjrm6\" (UID: \"efa1155d-cd1a-496d-94e8-eecbee129061\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.538134 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.543243 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.543265 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4"] Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.548418 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzrh\" (UniqueName: \"kubernetes.io/projected/8c9cecbe-09ba-4a38-a541-f897b225f416-kube-api-access-pnzrh\") pod \"machine-config-controller-84d6567774-5mxg5\" (UID: \"8c9cecbe-09ba-4a38-a541-f897b225f416\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.563846 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nnlj\" (UniqueName: \"kubernetes.io/projected/c8812b3c-bd75-49a8-b2a7-3db91675fc09-kube-api-access-7nnlj\") pod \"csi-hostpathplugin-cwvv5\" (UID: \"c8812b3c-bd75-49a8-b2a7-3db91675fc09\") " pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.619606 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.623460 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.623767 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.123742977 +0000 UTC m=+146.578502098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.623897 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.624497 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.124475456 +0000 UTC m=+146.579234577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.633064 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.633256 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dlhl\" (UniqueName: \"kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl\") pod \"marketplace-operator-79b997595-j7cp6\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.638943 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.640315 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j7th\" (UniqueName: \"kubernetes.io/projected/0fe37aaa-cc86-416c-9718-97b43f158977-kube-api-access-6j7th\") pod \"packageserver-d55dfcdfc-m6knf\" (UID: \"0fe37aaa-cc86-416c-9718-97b43f158977\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.654335 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjs4f\" (UniqueName: \"kubernetes.io/projected/a939e0cc-650f-4fb4-9a13-bcbf29ebdb76-kube-api-access-sjs4f\") pod \"service-ca-9c57cc56f-4rwfr\" (UID: \"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.654841 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.660166 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6zv\" (UniqueName: \"kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv\") pod \"collect-profiles-29323500-vz2xt\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.661676 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.684593 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.688015 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ctbt\" (UniqueName: \"kubernetes.io/projected/aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6-kube-api-access-6ctbt\") pod \"machine-config-server-kw2qg\" (UID: \"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6\") " pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.690606 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f56s4\" (UniqueName: \"kubernetes.io/projected/8fa4c4f2-9475-482e-b428-c2ec0abc2842-kube-api-access-f56s4\") pod \"dns-default-tq9w6\" (UID: \"8fa4c4f2-9475-482e-b428-c2ec0abc2842\") " pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.692149 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lqm4v" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.699189 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kw2qg" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.706920 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpktr\" (UniqueName: \"kubernetes.io/projected/4e183eb5-2230-43b3-b8b8-b1c3aaa21370-kube-api-access-hpktr\") pod \"multus-admission-controller-857f4d67dd-xnt99\" (UID: \"4e183eb5-2230-43b3-b8b8-b1c3aaa21370\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.724739 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.725387 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.225358078 +0000 UTC m=+146.680117199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.733366 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xvl\" (UniqueName: \"kubernetes.io/projected/b75eedff-1c9f-456b-800e-6eebbf0db535-kube-api-access-g8xvl\") pod \"service-ca-operator-777779d784-klq6x\" (UID: \"b75eedff-1c9f-456b-800e-6eebbf0db535\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: W1002 13:01:21.736959 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6578bd1a_eaad_452a_adf5_7f3e34838677.slice/crio-5d9a8e0d3e295e25918a4215d5ff9aef942a11b7038bdac7de72b550c115a589 WatchSource:0}: Error finding container 5d9a8e0d3e295e25918a4215d5ff9aef942a11b7038bdac7de72b550c115a589: Status 404 returned error can't find the container with id 5d9a8e0d3e295e25918a4215d5ff9aef942a11b7038bdac7de72b550c115a589 Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.828519 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.828954 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.328937709 +0000 UTC m=+146.783696830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.831997 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4lzm2"] Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.885337 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.917050 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.917179 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.917095 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.934295 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.935439 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:21 crc kubenswrapper[4724]: E1002 13:01:21.935986 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.435961166 +0000 UTC m=+146.890720287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.947188 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.977400 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx"] Oct 02 13:01:21 crc kubenswrapper[4724]: I1002 13:01:21.991971 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.003998 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.040586 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.042099 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.542082551 +0000 UTC m=+146.996841672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.087841 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.142388 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.143513 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.643487237 +0000 UTC m=+147.098246358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.149779 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.159273 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" event={"ID":"486adec0-0d45-4294-ac1b-5fff0dda6602","Type":"ContainerStarted","Data":"facab2719271aaad00b8f83fc7b3f45aaf46b03abfd83a48f78864679dfe8291"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.168296 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" event={"ID":"499ba1de-99e3-4a0c-be96-866d0127402d","Type":"ContainerStarted","Data":"d76d0ec4057371a4f1bfb6a4d6b74a9ae974edd03d1848b3c3ccc34ce8767450"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.175994 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n5rln" event={"ID":"6578bd1a-eaad-452a-adf5-7f3e34838677","Type":"ContainerStarted","Data":"5d9a8e0d3e295e25918a4215d5ff9aef942a11b7038bdac7de72b550c115a589"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.177616 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" event={"ID":"7aab6527-d135-45a0-8fe0-99de1fd40d3d","Type":"ContainerStarted","Data":"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4"} Oct 02 13:01:22 crc 
kubenswrapper[4724]: I1002 13:01:22.177803 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" event={"ID":"7aab6527-d135-45a0-8fe0-99de1fd40d3d","Type":"ContainerStarted","Data":"155b2ba8d9ceb546e3f310fa236b3111ccb53c91a4470cb248147158f27b9aef"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.179581 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kl4xt" event={"ID":"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7","Type":"ContainerStarted","Data":"1f16c7644b51cadbb9df0f1c385a7db415ba695ce93a476baa53667049828ae5"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.181402 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.181510 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kl4xt" event={"ID":"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7","Type":"ContainerStarted","Data":"84ec4f65f3c5b4722877a8d04dea83b89d4761e7a68d78bab34241d3afdebe94"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.184054 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" event={"ID":"da7051e3-8a79-43e7-9016-9d492b51a9fd","Type":"ContainerStarted","Data":"e16f021cb76f96807fe922c701a13dcb7679a47c9cac20d1834dbb7064b58900"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.186583 4724 generic.go:334] "Generic (PLEG): container finished" podID="4f4b9ef0-2fa1-4d48-81c9-e428e93c7034" containerID="a30232449d01583ffaaca47e6d49b6d2d70df344fa2e723c212820229d215813" exitCode=0 Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.186769 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 
10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.186828 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.187080 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" event={"ID":"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034","Type":"ContainerDied","Data":"a30232449d01583ffaaca47e6d49b6d2d70df344fa2e723c212820229d215813"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.189131 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kw2qg" event={"ID":"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6","Type":"ContainerStarted","Data":"ee35249a4c40805113beab387ac342ac1e10fee5d566a67bb84bc949b5e73c9a"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.193491 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" event={"ID":"27c39377-9fc1-4ca9-8ce4-8a1c61f181c0","Type":"ContainerStarted","Data":"3446c9fbafd8a0642edc671c45180de3d23c3837e89bffbf5cd9d57071c5d088"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.195609 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" event={"ID":"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9","Type":"ContainerStarted","Data":"a01421d11e8991a90ec5cfa6c226e86891aef561447822d19ae2302e7e61c79e"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.201456 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" 
event={"ID":"4501d69c-964b-4444-b8af-d56b9301a685","Type":"ContainerStarted","Data":"76f910b589fae2c6f08ffc99fd7465c9f68fb7786c22ad73cb129c245df7c7b2"} Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.202384 4724 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qxfvd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.202418 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.202839 4724 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zpmh7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.202913 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.210515 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.245895 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.251191 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.751171671 +0000 UTC m=+147.205930872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.362072 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.362461 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.862441466 +0000 UTC m=+147.317200587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.463222 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.463657 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:22.963640626 +0000 UTC m=+147.418399737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.564147 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.564490 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.064417406 +0000 UTC m=+147.519176527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.564608 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.565011 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.064994771 +0000 UTC m=+147.519753892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.589068 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-st27v" podStartSLOduration=122.589048179 podStartE2EDuration="2m2.589048179s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:22.586918845 +0000 UTC m=+147.041677966" watchObservedRunningTime="2025-10-02 13:01:22.589048179 +0000 UTC m=+147.043807310" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.666371 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.666804 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.166787766 +0000 UTC m=+147.621546887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.752014 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" podStartSLOduration=121.751999732 podStartE2EDuration="2m1.751999732s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:22.701645558 +0000 UTC m=+147.156404679" watchObservedRunningTime="2025-10-02 13:01:22.751999732 +0000 UTC m=+147.206758853" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.753941 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" podStartSLOduration=122.753929511 podStartE2EDuration="2m2.753929511s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:22.751846128 +0000 UTC m=+147.206605249" watchObservedRunningTime="2025-10-02 13:01:22.753929511 +0000 UTC m=+147.208688632" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.757430 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.768288 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.768842 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.779455 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.783166 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lvb24"] Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.784342 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.272607273 +0000 UTC m=+147.727366394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.797475 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8"] Oct 02 13:01:22 crc kubenswrapper[4724]: W1002 13:01:22.804423 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5bb856b_60df_44d0_9979_906fc271f66e.slice/crio-781385539b229e17ae3345ece50e9312b772fafef9d715e8f6d5d6274cfaf056 WatchSource:0}: Error finding container 781385539b229e17ae3345ece50e9312b772fafef9d715e8f6d5d6274cfaf056: Status 404 returned error can't find the container with id 781385539b229e17ae3345ece50e9312b772fafef9d715e8f6d5d6274cfaf056 Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.872277 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.872463 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:23.372447709 +0000 UTC m=+147.827206820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.873795 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.874178 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.374170713 +0000 UTC m=+147.828929834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.909839 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zl9kr" podStartSLOduration=122.909815925 podStartE2EDuration="2m2.909815925s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:22.907714351 +0000 UTC m=+147.362473462" watchObservedRunningTime="2025-10-02 13:01:22.909815925 +0000 UTC m=+147.364575046" Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.927300 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.939679 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cwvv5"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.948989 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9fscb"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.954504 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.959797 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-v94bt"] Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.974683 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.974894 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.47486307 +0000 UTC m=+147.929622191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:22 crc kubenswrapper[4724]: I1002 13:01:22.975066 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:22 crc kubenswrapper[4724]: E1002 13:01:22.975430 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.475422534 +0000 UTC m=+147.930181655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.076321 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.076665 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.576618495 +0000 UTC m=+148.031377616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.078772 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.080089 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.580074172 +0000 UTC m=+148.034833293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.138717 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.141258 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.154434 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.186891 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.187429 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.687412468 +0000 UTC m=+148.142171589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.189692 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tq9w6"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.191908 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.213747 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.229584 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-x726v" podStartSLOduration=122.229560654 podStartE2EDuration="2m2.229560654s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.199211756 +0000 UTC m=+147.653970887" watchObservedRunningTime="2025-10-02 13:01:23.229560654 +0000 UTC m=+147.684319775" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.233397 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4rwfr"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.234616 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-klq6x"] Oct 02 13:01:23 crc kubenswrapper[4724]: 
I1002 13:01:23.235896 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lqm4v"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.241054 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xnt99"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.241497 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"] Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.264393 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" event={"ID":"f0cb7f77-88ee-46c4-9c81-aa953416aec1","Type":"ContainerStarted","Data":"b0703101027ab20202234b478d94d926af6e64434ae9ba652f25a16791fa18de"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.264448 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" event={"ID":"f0cb7f77-88ee-46c4-9c81-aa953416aec1","Type":"ContainerStarted","Data":"078713cfc890cd3eb57ae0166d91c735f3d01f94486f9ea29d93a1e18a63c426"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.273651 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-kl4xt" podStartSLOduration=123.273630809 podStartE2EDuration="2m3.273630809s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.273310251 +0000 UTC m=+147.728069392" watchObservedRunningTime="2025-10-02 13:01:23.273630809 +0000 UTC m=+147.728389930" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.282116 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" 
event={"ID":"4f4b9ef0-2fa1-4d48-81c9-e428e93c7034","Type":"ContainerStarted","Data":"2e9d17ff38a3618621c1b42012af9956a88cf002753da206b8e4271cd534e680"} Oct 02 13:01:23 crc kubenswrapper[4724]: W1002 13:01:23.288693 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda38ec3fe_c0db_4d6d_94a0_f4c5dfbc33c1.slice/crio-1e3efbcdbf02a9c8aea3f14cb10526b073b1dc0e288c0ab982bf4029592904b1 WatchSource:0}: Error finding container 1e3efbcdbf02a9c8aea3f14cb10526b073b1dc0e288c0ab982bf4029592904b1: Status 404 returned error can't find the container with id 1e3efbcdbf02a9c8aea3f14cb10526b073b1dc0e288c0ab982bf4029592904b1 Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.289161 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.289509 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.78949821 +0000 UTC m=+148.244257331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.291419 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" event={"ID":"5a3d5592-2d16-4d75-a734-664f6dd16418","Type":"ContainerStarted","Data":"618fd36a5c92ced45a49b8b992a843746a962ffaf3c59d998648934a5786d875"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.293717 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v94bt" event={"ID":"152155f3-933d-43c5-abeb-7c06899a6939","Type":"ContainerStarted","Data":"8c0dccdb656557837a231cadae037a18386ff2de5912c00f01eab50a0f980615"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.295234 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" event={"ID":"2b330608-20dc-445e-bf75-4393541c7fd4","Type":"ContainerStarted","Data":"b68069161238ae426392a8ff8347320ec5098a667f6c4c0ab5a222eca21c595f"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.298864 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" event={"ID":"42570dd4-bfb7-43a7-9f10-bf0df9236925","Type":"ContainerStarted","Data":"f1de3ecf5c1acaba606b9409f6bbe6e1f50ac472c94f4ccdf860a4f2cb8e525a"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.298907 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" event={"ID":"42570dd4-bfb7-43a7-9f10-bf0df9236925","Type":"ContainerStarted","Data":"6bd90c245777340b1ae0e86e548d5a674dd9c72ac0c313ab871beb0358ea3161"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.302229 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" event={"ID":"59e1178f-be06-4966-9e77-031df1e58c1a","Type":"ContainerStarted","Data":"dd227409f2ce828e974ef390f7744e0985f5be8d1083bf11315167ebb2bf3cef"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.320504 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lvb24" event={"ID":"131c7969-a8c8-4cfc-b655-ac3d400fae1b","Type":"ContainerStarted","Data":"0d92c28dce6f440d4ffa1da95a544dc0626260b041f9b0e39d9ecb2670828692"} Oct 02 13:01:23 crc kubenswrapper[4724]: W1002 13:01:23.323719 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75eedff_1c9f_456b_800e_6eebbf0db535.slice/crio-cde49c5d109ec0883b5849a7c5160094faf41000059771d219755f885334b718 WatchSource:0}: Error finding container cde49c5d109ec0883b5849a7c5160094faf41000059771d219755f885334b718: Status 404 returned error can't find the container with id cde49c5d109ec0883b5849a7c5160094faf41000059771d219755f885334b718 Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.324202 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" event={"ID":"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9","Type":"ContainerStarted","Data":"3dcdf69efe5db410e06f45fdb1ab6c4ba7295476b436c701bd08e770a240b694"} Oct 02 13:01:23 crc kubenswrapper[4724]: W1002 13:01:23.326750 4724 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e183eb5_2230_43b3_b8b8_b1c3aaa21370.slice/crio-aca662199ec4b35fd5dafb57cf0c0f0c04d6ee93effbfaac799cb60439ec840a WatchSource:0}: Error finding container aca662199ec4b35fd5dafb57cf0c0f0c04d6ee93effbfaac799cb60439ec840a: Status 404 returned error can't find the container with id aca662199ec4b35fd5dafb57cf0c0f0c04d6ee93effbfaac799cb60439ec840a Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.327876 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" event={"ID":"5111b397-09d5-4412-9135-2ea4914c00db","Type":"ContainerStarted","Data":"68513c186055cd1d5e316eddfc620a70e1605fd458b7454ecf438803ff765a25"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.332653 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" event={"ID":"efa1155d-cd1a-496d-94e8-eecbee129061","Type":"ContainerStarted","Data":"0021558436d3fd3086d0715bb192fd927dbcdff3027d57b04209abab68c226f9"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.339213 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" event={"ID":"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16","Type":"ContainerStarted","Data":"4f8ebada7410e05eff0ee870367e29c756916edfadbffb35237f920791b65bc6"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.339545 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" event={"ID":"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16","Type":"ContainerStarted","Data":"65df8f38939dab748f4470797dd5e44a1930876c52b2f12cec6ff3a9e9764946"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.345083 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-kw2qg" event={"ID":"aa3fb3e7-b85a-4ba1-a075-4d3ae0c6eed6","Type":"ContainerStarted","Data":"b02f3a0a6d76f5038ee94d73627da40cd4045be5a04ec81b282cb1d44fac7b7f"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.347611 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" event={"ID":"c8812b3c-bd75-49a8-b2a7-3db91675fc09","Type":"ContainerStarted","Data":"68d75e5bcf2f5ff9cfd1be682ed9ee0c8e3823c8b7a459637f17c6913ce6a56e"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.350952 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n5rln" event={"ID":"6578bd1a-eaad-452a-adf5-7f3e34838677","Type":"ContainerStarted","Data":"579e21015aab19daba5df1d0ee37cc451a39a5db218c9d885af6c0c15dc2b0a2"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.356879 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" event={"ID":"1d28f605-0e15-46b5-9c71-b6a123c0e0ce","Type":"ContainerStarted","Data":"ce131a9f175c0b9c006e858e23267d4390c53e76b9137dd468eb92436163afbf"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.361206 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" event={"ID":"a9c024ff-b6a9-4b92-8c1d-debc51c10ec9","Type":"ContainerStarted","Data":"2757b106f6c8aadb665d1c2318b5ffde6d33e909cb03285a850c9ef9e755792c"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.373300 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" event={"ID":"2b666d74-c6b0-4909-83af-2b736c0e032a","Type":"ContainerStarted","Data":"478b16c0fccf0bc43329d2e6039473821d42c2d049214ec6061cdf19f7282cd6"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 
13:01:23.390319 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" podStartSLOduration=122.390298621 podStartE2EDuration="2m2.390298621s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.390227359 +0000 UTC m=+147.844986490" watchObservedRunningTime="2025-10-02 13:01:23.390298621 +0000 UTC m=+147.845057742" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.390929 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.392357 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.892338932 +0000 UTC m=+148.347098053 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.399721 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" event={"ID":"ac67382d-e26c-48c3-933e-19fecd4d5d49","Type":"ContainerStarted","Data":"4abbb0c9252a17c6b2c5161bd5cd347c0b25b003e2ebc4cdb5ec67c913a1d123"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.403470 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" event={"ID":"f5bb856b-60df-44d0-9979-906fc271f66e","Type":"ContainerStarted","Data":"781385539b229e17ae3345ece50e9312b772fafef9d715e8f6d5d6274cfaf056"} Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.406290 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.407259 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.407269 4724 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zpmh7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: 
connect: connection refused" start-of-body= Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.407303 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.407367 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.415103 4724 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m8t6v container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.415163 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.492917 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.496906 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:23.996887417 +0000 UTC m=+148.451646568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.500351 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-t8hf4" podStartSLOduration=122.500330274 podStartE2EDuration="2m2.500330274s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.499265707 +0000 UTC m=+147.954024848" watchObservedRunningTime="2025-10-02 13:01:23.500330274 +0000 UTC m=+147.955089395" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.542685 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" podStartSLOduration=123.542668126 podStartE2EDuration="2m3.542668126s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.541199078 +0000 UTC 
m=+147.995958209" watchObservedRunningTime="2025-10-02 13:01:23.542668126 +0000 UTC m=+147.997427247" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.545646 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.546076 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.546129 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.581763 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-n5rln" podStartSLOduration=123.581742274 podStartE2EDuration="2m3.581742274s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.580682177 +0000 UTC m=+148.035441298" watchObservedRunningTime="2025-10-02 13:01:23.581742274 +0000 UTC m=+148.036501395" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.594137 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc 
kubenswrapper[4724]: E1002 13:01:23.594865 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.094839145 +0000 UTC m=+148.549598256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.624693 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kw2qg" podStartSLOduration=5.62467226 podStartE2EDuration="5.62467226s" podCreationTimestamp="2025-10-02 13:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:23.624051684 +0000 UTC m=+148.078810845" watchObservedRunningTime="2025-10-02 13:01:23.62467226 +0000 UTC m=+148.079431381" Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.696323 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.696888 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.196874427 +0000 UTC m=+148.651633548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.799845 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.800955 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.30092925 +0000 UTC m=+148.755688381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.801073 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.801473 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.301462653 +0000 UTC m=+148.756221774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.901695 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.901838 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.401809662 +0000 UTC m=+148.856568793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:23 crc kubenswrapper[4724]: I1002 13:01:23.902419 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:23 crc kubenswrapper[4724]: E1002 13:01:23.902691 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.402678164 +0000 UTC m=+148.857437285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.009670 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.010312 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.510292406 +0000 UTC m=+148.965051537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.111525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.112326 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.612310637 +0000 UTC m=+149.067069758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.213115 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.213308 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.713279882 +0000 UTC m=+149.168039003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.213472 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.213809 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.713800565 +0000 UTC m=+149.168559736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.315028 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.315304 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.815274072 +0000 UTC m=+149.270033203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.315459 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.315766 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.815751234 +0000 UTC m=+149.270510355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.409229 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" event={"ID":"efa1155d-cd1a-496d-94e8-eecbee129061","Type":"ContainerStarted","Data":"f8b67d3f717d5e032393e3e2fca478a5d4de77ac56fe769f30797c161364c457"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.410516 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" event={"ID":"98449ccf-cf29-44ab-9400-994b04309bb5","Type":"ContainerStarted","Data":"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.410605 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" event={"ID":"98449ccf-cf29-44ab-9400-994b04309bb5","Type":"ContainerStarted","Data":"f0795d374d61fbbc89327ed845ee7cf6f1596a6f1b00cb5577bfb793908baa52"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.411279 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.411933 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lqm4v" event={"ID":"324d6aa8-e27f-49ab-9b8e-c72665cc2f34","Type":"ContainerStarted","Data":"49df1c2a46e8f6262c1603692a1b29cec95b0e34cb6edccc1385d384149ecfc4"} Oct 
02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.411962 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lqm4v" event={"ID":"324d6aa8-e27f-49ab-9b8e-c72665cc2f34","Type":"ContainerStarted","Data":"a5358a5ad86c2a577f14940cb68b6e948b57fa02ad462754f560e650b943ed80"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.412919 4724 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j7cp6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.412963 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.414291 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" event={"ID":"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9","Type":"ContainerStarted","Data":"c214487498261759365c8c927fa8ede2775850ff44de707bcf8e2da0a4a23ab8"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.416584 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.416746 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.916713209 +0000 UTC m=+149.371472340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.417103 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.417402 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:24.917390406 +0000 UTC m=+149.372149527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.419218 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" event={"ID":"5111b397-09d5-4412-9135-2ea4914c00db","Type":"ContainerStarted","Data":"e4c83928fa62efaca6c26e8d652de528885157c45152e05b8520dfb54b23ad35"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.421398 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" event={"ID":"da7051e3-8a79-43e7-9016-9d492b51a9fd","Type":"ContainerStarted","Data":"88d7cd633fe0ba33b5658c5e9f0e3812f5a1b4092cb4fe138e782c498e474df4"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.422891 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" event={"ID":"b75eedff-1c9f-456b-800e-6eebbf0db535","Type":"ContainerStarted","Data":"cde49c5d109ec0883b5849a7c5160094faf41000059771d219755f885334b718"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.424653 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lvb24" event={"ID":"131c7969-a8c8-4cfc-b655-ac3d400fae1b","Type":"ContainerStarted","Data":"ce7c08e152d55dcc8833669a077a22b6ea53395153cfe2ea5cc39d0a947d4bd1"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.427316 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" 
event={"ID":"2b330608-20dc-445e-bf75-4393541c7fd4","Type":"ContainerStarted","Data":"3f7fa33dcc07f872bf0ecdcc5789b0fc5bcfb7c5adb13c01f0f1311610bd4312"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.428807 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" event={"ID":"89edffbe-3576-4a6e-8bd6-884ee6fbc58d","Type":"ContainerStarted","Data":"05b9710fd4331481721c7c11139d54527367e9a4ac2e8e25c22c9ebd26263b94"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.428843 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" event={"ID":"89edffbe-3576-4a6e-8bd6-884ee6fbc58d","Type":"ContainerStarted","Data":"3c653a0a0a39e3f30dd6d09c2a887bff793e6e303078e18870b0fbde7c5ed789"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.429045 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wjrm6" podStartSLOduration=123.42902544 podStartE2EDuration="2m3.42902544s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.428116217 +0000 UTC m=+148.882875338" watchObservedRunningTime="2025-10-02 13:01:24.42902544 +0000 UTC m=+148.883784561" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.430006 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" event={"ID":"1d28f605-0e15-46b5-9c71-b6a123c0e0ce","Type":"ContainerStarted","Data":"5d7bdee190827aa5aa03eb1cc07e1df15aba67261a88ee607a3d9ebd13b401c3"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.430953 4724 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/dns-default-tq9w6" event={"ID":"8fa4c4f2-9475-482e-b428-c2ec0abc2842","Type":"ContainerStarted","Data":"fb7d04731bf6d7b70332076e827a5d7d264216489728de8d347d9e1348852315"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.432196 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" event={"ID":"7d87bc9c-fb3d-42c1-94c9-ec34e34d0f16","Type":"ContainerStarted","Data":"1fffe8b7d183723f37ea4666a0be7d90b5dc9ec759d52c0ed7eea83625ab5ab2"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.433162 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" event={"ID":"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1","Type":"ContainerStarted","Data":"b164b7521a3e658374f983a0f159ddccce542111c895c4b0a8915059ee23190d"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.433185 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" event={"ID":"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1","Type":"ContainerStarted","Data":"1e3efbcdbf02a9c8aea3f14cb10526b073b1dc0e288c0ab982bf4029592904b1"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.434761 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" event={"ID":"499ba1de-99e3-4a0c-be96-866d0127402d","Type":"ContainerStarted","Data":"c175f68a59de5f59147ca6aeef52ec63d9c469a319f376a7923824fa308793cd"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.435702 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" event={"ID":"4e183eb5-2230-43b3-b8b8-b1c3aaa21370","Type":"ContainerStarted","Data":"aca662199ec4b35fd5dafb57cf0c0f0c04d6ee93effbfaac799cb60439ec840a"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.437350 4724 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" event={"ID":"2b666d74-c6b0-4909-83af-2b736c0e032a","Type":"ContainerStarted","Data":"8561e48495139d77a541f9dc8d19393ccd6a12e20d401013dc020b885fd9da16"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.438732 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" event={"ID":"59bd4d63-57bf-4b08-b7f1-f0c7d733e571","Type":"ContainerStarted","Data":"b0b9f435d0980740c089d63f6cd9fc7e33fbde41c7fa275fe2ad3c87a57bd9bd"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.438765 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" event={"ID":"59bd4d63-57bf-4b08-b7f1-f0c7d733e571","Type":"ContainerStarted","Data":"a524d6b446d7b90cbd2429b18f49ceb830c84f9639ea535593aea7925d92ffe2"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.441864 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" event={"ID":"5a3d5592-2d16-4d75-a734-664f6dd16418","Type":"ContainerStarted","Data":"ff58ee0252e519c35930064671ecb732520fe05b40572aee681f2d79143e7040"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.451827 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" event={"ID":"0fe37aaa-cc86-416c-9718-97b43f158977","Type":"ContainerStarted","Data":"98e50056c206e8bcaf507ae9216322471b0ff8af5abfb2c7428a97d3eb682027"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.451887 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" event={"ID":"0fe37aaa-cc86-416c-9718-97b43f158977","Type":"ContainerStarted","Data":"910ec5a292b7de742f451b2d798dbea81d1bee623bcb6c635dfa4b0ce75a113d"} Oct 02 
13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.464543 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v94bt" event={"ID":"152155f3-933d-43c5-abeb-7c06899a6939","Type":"ContainerStarted","Data":"7d2f802c10cd42f3b912497c41cc81b34c985c4b5be6c577e7d45ea3c5cffc69"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.473949 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" podStartSLOduration=123.473928406 podStartE2EDuration="2m3.473928406s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.455333386 +0000 UTC m=+148.910092507" watchObservedRunningTime="2025-10-02 13:01:24.473928406 +0000 UTC m=+148.928687527" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.486871 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" event={"ID":"8c9cecbe-09ba-4a38-a541-f897b225f416","Type":"ContainerStarted","Data":"a33301017f5f2f3fc9b87702a879822afce6e433f15b49c160e5d42bc2328e20"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.486930 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" event={"ID":"8c9cecbe-09ba-4a38-a541-f897b225f416","Type":"ContainerStarted","Data":"ad6c023443944355a41611fbdcaad531bd68a7d03aea9b71acce3f9d56372b9a"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.494554 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" event={"ID":"f5bb856b-60df-44d0-9979-906fc271f66e","Type":"ContainerStarted","Data":"b95ce7b7e15bb983caad4459cd0e311e67cfae4dea583d328f56ccfeb5e25093"} Oct 02 13:01:24 crc 
kubenswrapper[4724]: I1002 13:01:24.502336 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" event={"ID":"59e1178f-be06-4966-9e77-031df1e58c1a","Type":"ContainerStarted","Data":"97c5588ca68f83053441c7b87fe7f1a87da3514e6e55aed6d0d61c4b0e9b44d2"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.502406 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lvb24" podStartSLOduration=124.502394986 podStartE2EDuration="2m4.502394986s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.500698853 +0000 UTC m=+148.955457974" watchObservedRunningTime="2025-10-02 13:01:24.502394986 +0000 UTC m=+148.957154097" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.519811 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.521107 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.021088639 +0000 UTC m=+149.475847760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.523736 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" event={"ID":"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76","Type":"ContainerStarted","Data":"1e93c88463022f9a8b16d42dbdedd1fd4ccfff17d9e02e5183f7b6fa1bfc53f6"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.533487 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" podStartSLOduration=124.533469473 podStartE2EDuration="2m4.533469473s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.533212876 +0000 UTC m=+148.987972007" watchObservedRunningTime="2025-10-02 13:01:24.533469473 +0000 UTC m=+148.988228604" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.536813 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" event={"ID":"ac67382d-e26c-48c3-933e-19fecd4d5d49","Type":"ContainerStarted","Data":"1571a324e8f56510546cdec3e6a07d79e55945829a523f0a0fa013ecb52299b7"} Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.538096 4724 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m8t6v container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 
10.217.0.11:6443: connect: connection refused" start-of-body= Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.538211 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.538583 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.538887 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.541686 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.541787 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.543482 4724 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-d2rqx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.543659 4724 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" podUID="f0cb7f77-88ee-46c4-9c81-aa953416aec1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.549626 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.549685 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.581242 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lqm4v" podStartSLOduration=6.581222201 podStartE2EDuration="6.581222201s" podCreationTimestamp="2025-10-02 13:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.558885445 +0000 UTC m=+149.013644566" watchObservedRunningTime="2025-10-02 13:01:24.581222201 +0000 UTC m=+149.035981322" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.582211 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dlnq8" podStartSLOduration=124.582205906 podStartE2EDuration="2m4.582205906s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.580708698 +0000 UTC m=+149.035467819" watchObservedRunningTime="2025-10-02 13:01:24.582205906 +0000 UTC m=+149.036965017" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.605956 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" podStartSLOduration=123.605933206 podStartE2EDuration="2m3.605933206s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.604745036 +0000 UTC m=+149.059504167" watchObservedRunningTime="2025-10-02 13:01:24.605933206 +0000 UTC m=+149.060692327" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.621735 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.624902 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.124883715 +0000 UTC m=+149.579642946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.648689 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" podStartSLOduration=84.648671227 podStartE2EDuration="1m24.648671227s" podCreationTimestamp="2025-10-02 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.645829085 +0000 UTC m=+149.100588206" watchObservedRunningTime="2025-10-02 13:01:24.648671227 +0000 UTC m=+149.103430348" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.664856 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwb55" podStartSLOduration=124.664838226 podStartE2EDuration="2m4.664838226s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.664117118 +0000 UTC m=+149.118876239" watchObservedRunningTime="2025-10-02 13:01:24.664838226 +0000 UTC m=+149.119597347" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.726012 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.727065 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.22704821 +0000 UTC m=+149.681807331 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.786379 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" podStartSLOduration=124.786356619 podStartE2EDuration="2m4.786356619s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.784893252 +0000 UTC m=+149.239652363" watchObservedRunningTime="2025-10-02 13:01:24.786356619 +0000 UTC m=+149.241115740" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.786709 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fh5fg" podStartSLOduration=123.786702928 podStartE2EDuration="2m3.786702928s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 13:01:24.698789115 +0000 UTC m=+149.153548236" watchObservedRunningTime="2025-10-02 13:01:24.786702928 +0000 UTC m=+149.241462049" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.827716 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.828224 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.328200828 +0000 UTC m=+149.782959989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.837895 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tj5h8" podStartSLOduration=124.837874793 podStartE2EDuration="2m4.837874793s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.835577835 +0000 UTC m=+149.290336966" watchObservedRunningTime="2025-10-02 
13:01:24.837874793 +0000 UTC m=+149.292633914" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.879053 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9fscb" podStartSLOduration=124.879031924 podStartE2EDuration="2m4.879031924s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.878818509 +0000 UTC m=+149.333577630" watchObservedRunningTime="2025-10-02 13:01:24.879031924 +0000 UTC m=+149.333791055" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.899389 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jpfxn" podStartSLOduration=124.899369029 podStartE2EDuration="2m4.899369029s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.896945447 +0000 UTC m=+149.351704568" watchObservedRunningTime="2025-10-02 13:01:24.899369029 +0000 UTC m=+149.354128150" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.920181 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" podStartSLOduration=123.920159185 podStartE2EDuration="2m3.920159185s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:24.917092757 +0000 UTC m=+149.371851898" watchObservedRunningTime="2025-10-02 13:01:24.920159185 +0000 UTC m=+149.374918306" Oct 02 13:01:24 crc kubenswrapper[4724]: I1002 13:01:24.929719 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:24 crc kubenswrapper[4724]: E1002 13:01:24.930179 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.430159948 +0000 UTC m=+149.884919069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.033193 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.033682 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.533659286 +0000 UTC m=+149.988418417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.047450 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.049530 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.060381 4724 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xjrx9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.060446 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" podUID="da7051e3-8a79-43e7-9016-9d492b51a9fd" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.134433 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 
13:01:25.134657 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.634624771 +0000 UTC m=+150.089383902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.135139 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.135488 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.635474542 +0000 UTC m=+150.090233663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.145133 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.145188 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.147010 4724 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-n4xw8 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.147081 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" podUID="4501d69c-964b-4444-b8af-d56b9301a685" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.235885 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc 
kubenswrapper[4724]: E1002 13:01:25.236742 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.736708483 +0000 UTC m=+150.191467614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.337525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.338128 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.838107059 +0000 UTC m=+150.292866250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.439127 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.439318 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.939288018 +0000 UTC m=+150.394047139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.439505 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.439810 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:25.939802231 +0000 UTC m=+150.394561342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.540684 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.540931 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.040903889 +0000 UTC m=+150.495663010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.541090 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.541434 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.041420312 +0000 UTC m=+150.496179433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.543911 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" event={"ID":"499ba1de-99e3-4a0c-be96-866d0127402d","Type":"ContainerStarted","Data":"331075d82a6ca6a703bc72ba1a3f753a79aa6e9557313cfe67162deadfaf5d14"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.545707 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" event={"ID":"4e183eb5-2230-43b3-b8b8-b1c3aaa21370","Type":"ContainerStarted","Data":"4d7c5f2eef0c9f4e4253b361bf47b18fd1dde19dd9f5e588d9d841a63a45dcfe"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.545749 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" event={"ID":"4e183eb5-2230-43b3-b8b8-b1c3aaa21370","Type":"ContainerStarted","Data":"8fe9a49ba980e396a0794a57fd6a0fa969f83b1e8adfbd6188ce49f97e122bcb"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.547410 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" event={"ID":"2b330608-20dc-445e-bf75-4393541c7fd4","Type":"ContainerStarted","Data":"6cc7e6e98b67044adb477b77158e6a586274b3b14ade8eab21bd5813ff8bbeb1"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.549260 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" 
event={"ID":"8c9cecbe-09ba-4a38-a541-f897b225f416","Type":"ContainerStarted","Data":"4153edf15fe2d16e34992f3a88206b1e86552ba28410caac7b8c8687b05b9256"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.550823 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" event={"ID":"21e8cad3-fa39-40e9-9d04-ff4dd12c3ec9","Type":"ContainerStarted","Data":"8f10b09ee21ea8800e3f4385653ee59cd4b380642f797214ad2e7dc034e83727"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.552419 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" event={"ID":"a939e0cc-650f-4fb4-9a13-bcbf29ebdb76","Type":"ContainerStarted","Data":"a4cfe63a72dba88f12eaaac8bcebb167683ea1ce438a3b731274f978a198355b"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.553855 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" event={"ID":"2b666d74-c6b0-4909-83af-2b736c0e032a","Type":"ContainerStarted","Data":"76f7dd179d348944b88a915cb9ff83c2688a0f0f11522a3c498180279f9bbf4b"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.555760 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tq9w6" event={"ID":"8fa4c4f2-9475-482e-b428-c2ec0abc2842","Type":"ContainerStarted","Data":"a71305084ef06bc68150ee70c8737bb48959a3d112c28618a000dc8747dd065c"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.555789 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tq9w6" event={"ID":"8fa4c4f2-9475-482e-b428-c2ec0abc2842","Type":"ContainerStarted","Data":"960e56104ce91cbf56b0ddaf11846e91b01f65d72781efc1bca80d06d856e111"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.556207 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:25 crc kubenswrapper[4724]: 
I1002 13:01:25.558878 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" event={"ID":"a38ec3fe-c0db-4d6d-94a0-f4c5dfbc33c1","Type":"ContainerStarted","Data":"64218a0153a220c072737d5516c1532a2032e9a520de9d87cc2ed337346f7358"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.559091 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.559095 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:25 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:25 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:25 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.559193 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.560720 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" event={"ID":"b75eedff-1c9f-456b-800e-6eebbf0db535","Type":"ContainerStarted","Data":"5febb604edd67178c9bd9f2413a8c49eb0fa3d9a6f737c94d79edc0be3801625"} Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.561518 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.561563 4724 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-j7cp6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.561609 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.562143 4724 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-d2rqx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.562182 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" podUID="f0cb7f77-88ee-46c4-9c81-aa953416aec1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.563876 4724 patch_prober.go:28] interesting pod/console-operator-58897d9998-v94bt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.563969 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v94bt" podUID="152155f3-933d-43c5-abeb-7c06899a6939" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.574252 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4lzm2" podStartSLOduration=125.574232603 podStartE2EDuration="2m5.574232603s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.57293804 +0000 UTC m=+150.027697171" watchObservedRunningTime="2025-10-02 13:01:25.574232603 +0000 UTC m=+150.028991734" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.596079 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-v94bt" podStartSLOduration=125.596061245 podStartE2EDuration="2m5.596061245s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.594005233 +0000 UTC m=+150.048764364" watchObservedRunningTime="2025-10-02 13:01:25.596061245 +0000 UTC m=+150.050820366" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.618910 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xnt99" podStartSLOduration=124.618889092 podStartE2EDuration="2m4.618889092s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.614198834 +0000 UTC m=+150.068957955" watchObservedRunningTime="2025-10-02 13:01:25.618889092 +0000 UTC m=+150.073648213" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.641995 
4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.644129 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.14409926 +0000 UTC m=+150.598858391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.657070 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tq9w6" podStartSLOduration=7.657050728 podStartE2EDuration="7.657050728s" podCreationTimestamp="2025-10-02 13:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.656636197 +0000 UTC m=+150.111395318" watchObservedRunningTime="2025-10-02 13:01:25.657050728 +0000 UTC m=+150.111809849" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.687021 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" podStartSLOduration=124.686981475 
podStartE2EDuration="2m4.686981475s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.685919878 +0000 UTC m=+150.140679009" watchObservedRunningTime="2025-10-02 13:01:25.686981475 +0000 UTC m=+150.141740616" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.705835 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6hlf" podStartSLOduration=124.705819222 podStartE2EDuration="2m4.705819222s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.705499994 +0000 UTC m=+150.160259125" watchObservedRunningTime="2025-10-02 13:01:25.705819222 +0000 UTC m=+150.160578343" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.725494 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg57j" podStartSLOduration=124.725477279 podStartE2EDuration="2m4.725477279s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.724492324 +0000 UTC m=+150.179251445" watchObservedRunningTime="2025-10-02 13:01:25.725477279 +0000 UTC m=+150.180236400" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.744808 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.745230 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.245207128 +0000 UTC m=+150.699966249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.771702 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hdbqt" podStartSLOduration=125.771680198 podStartE2EDuration="2m5.771680198s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.768961619 +0000 UTC m=+150.223720750" watchObservedRunningTime="2025-10-02 13:01:25.771680198 +0000 UTC m=+150.226439319" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.804863 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5mxg5" podStartSLOduration=124.804830417 podStartE2EDuration="2m4.804830417s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.803706888 +0000 UTC m=+150.258466009" 
watchObservedRunningTime="2025-10-02 13:01:25.804830417 +0000 UTC m=+150.259589538" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.835504 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4rwfr" podStartSLOduration=124.835487492 podStartE2EDuration="2m4.835487492s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.834428645 +0000 UTC m=+150.289187776" watchObservedRunningTime="2025-10-02 13:01:25.835487492 +0000 UTC m=+150.290246603" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.846269 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.846487 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.34645511 +0000 UTC m=+150.801214231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.847044 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.847455 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.347443205 +0000 UTC m=+150.802202386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.856628 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f2jql" podStartSLOduration=124.856609897 podStartE2EDuration="2m4.856609897s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.854678958 +0000 UTC m=+150.309438079" watchObservedRunningTime="2025-10-02 13:01:25.856609897 +0000 UTC m=+150.311369018" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.884893 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w4dqz" podStartSLOduration=125.884869792 podStartE2EDuration="2m5.884869792s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.88283664 +0000 UTC m=+150.337595771" watchObservedRunningTime="2025-10-02 13:01:25.884869792 +0000 UTC m=+150.339628923" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.913672 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-klq6x" podStartSLOduration=124.91365372 podStartE2EDuration="2m4.91365372s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.913429044 +0000 UTC m=+150.368188185" watchObservedRunningTime="2025-10-02 13:01:25.91365372 +0000 UTC m=+150.368412841" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.940862 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" podStartSLOduration=124.940841248 podStartE2EDuration="2m4.940841248s" podCreationTimestamp="2025-10-02 12:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:25.938520299 +0000 UTC m=+150.393279420" watchObservedRunningTime="2025-10-02 13:01:25.940841248 +0000 UTC m=+150.395600369" Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.948169 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.948392 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.448360738 +0000 UTC m=+150.903119869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:25 crc kubenswrapper[4724]: I1002 13:01:25.948610 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:25 crc kubenswrapper[4724]: E1002 13:01:25.948981 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.448965483 +0000 UTC m=+150.903724664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.050159 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.050338 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.550305337 +0000 UTC m=+151.005064478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.050693 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.051073 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.551056906 +0000 UTC m=+151.005816027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.152012 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.152170 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.652143353 +0000 UTC m=+151.106902474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.152298 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.152645 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.652634246 +0000 UTC m=+151.107393367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.208448 4724 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-974gc container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.208448 4724 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-974gc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.208507 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" podUID="4f4b9ef0-2fa1-4d48-81c9-e428e93c7034" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.208592 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" podUID="4f4b9ef0-2fa1-4d48-81c9-e428e93c7034" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: 
connection refused" Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.253322 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.253461 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.753438396 +0000 UTC m=+151.208197527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.253919 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.254241 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:26.754232206 +0000 UTC m=+151.208991327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.355396 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.355581 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.85555996 +0000 UTC m=+151.310319081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.361788 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.362607 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.862590858 +0000 UTC m=+151.317349979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.463510 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.463641 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.963618564 +0000 UTC m=+151.418377685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.464661 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.465059 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:26.96504366 +0000 UTC m=+151.419802791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.551305 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:26 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:26 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:26 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.551387 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.566319 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.566646 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:27.06661516 +0000 UTC m=+151.521374281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.568443 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" event={"ID":"c8812b3c-bd75-49a8-b2a7-3db91675fc09","Type":"ContainerStarted","Data":"04b5f44cd4e20728a942c72e4ab20518584c0f0c88483ade4816e58df3ecfb86"} Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.573705 4724 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j7cp6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.573767 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.573852 4724 patch_prober.go:28] interesting pod/console-operator-58897d9998-v94bt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 02 
13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.573870 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v94bt" podUID="152155f3-933d-43c5-abeb-7c06899a6939" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.668367 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.669500 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.169485782 +0000 UTC m=+151.624244973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.769332 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.769722 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.269701007 +0000 UTC m=+151.724460128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.870669 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.871160 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.371143934 +0000 UTC m=+151.825903055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:26 crc kubenswrapper[4724]: I1002 13:01:26.971319 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:26 crc kubenswrapper[4724]: E1002 13:01:26.971682 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.471666877 +0000 UTC m=+151.926425998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.073496 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.074021 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.573999326 +0000 UTC m=+152.028758447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.175864 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.176078 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.676026817 +0000 UTC m=+152.130785938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.176334 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.176372 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.176652 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.676644943 +0000 UTC m=+152.131404054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.177580 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.277152 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.277339 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.277431 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.278062 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.778039138 +0000 UTC m=+152.232798259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.278130 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.287248 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.287301 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.287817 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.379872 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.380280 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.880265175 +0000 UTC m=+152.335024296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.436203 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.444429 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.481459 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.481850 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:27.981830844 +0000 UTC m=+152.436589965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.493416 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.547324 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:27 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:27 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:27 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.547587 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.582706 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.583023 
4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.083006854 +0000 UTC m=+152.537765985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.591536 4724 patch_prober.go:28] interesting pod/console-operator-58897d9998-v94bt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.591604 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v94bt" podUID="152155f3-933d-43c5-abeb-7c06899a6939" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.684875 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.686461 4724 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.186440881 +0000 UTC m=+152.641200012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.786234 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.786644 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.286623165 +0000 UTC m=+152.741382336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.887697 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.887904 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.387871537 +0000 UTC m=+152.842630658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.888273 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.888589 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.388581325 +0000 UTC m=+152.843340446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:27 crc kubenswrapper[4724]: I1002 13:01:27.988986 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:27 crc kubenswrapper[4724]: E1002 13:01:27.989359 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.489341554 +0000 UTC m=+152.944100675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: W1002 13:01:28.032152 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-cb3ef3b6c5e41e7c986780f867bc0321e680101f64d4fd14a1d7c9d913999ae4 WatchSource:0}: Error finding container cb3ef3b6c5e41e7c986780f867bc0321e680101f64d4fd14a1d7c9d913999ae4: Status 404 returned error can't find the container with id cb3ef3b6c5e41e7c986780f867bc0321e680101f64d4fd14a1d7c9d913999ae4 Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.094747 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.095133 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.59511809 +0000 UTC m=+153.049877211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.097356 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.098546 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.120595 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.134898 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.198504 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.198944 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnfx\" (UniqueName: \"kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " 
pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.198993 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.199027 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.698997288 +0000 UTC m=+153.153756409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.199071 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.199134 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.199448 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.699428899 +0000 UTC m=+153.154188020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.287035 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.288055 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.290517 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.301382 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.301654 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whnfx\" (UniqueName: \"kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.301677 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.301711 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.302065 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.302138 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.802122167 +0000 UTC m=+153.256881288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.302610 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.349845 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.405691 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content\") pod \"certified-operators-vqs8m\" (UID: 
\"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.405742 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.405803 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn79v\" (UniqueName: \"kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.405839 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.406146 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:28.906130438 +0000 UTC m=+153.360889559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.409830 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whnfx\" (UniqueName: \"kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx\") pod \"community-operators-x7mvk\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.469846 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.506019 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.506502 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.506740 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " 
pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.506830 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn79v\" (UniqueName: \"kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.506904 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.507402 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.507441 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.507651 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.007630946 +0000 UTC m=+153.462390067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.508136 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.534492 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.551678 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:28 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:28 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:28 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.551727 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.582241 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn79v\" 
(UniqueName: \"kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v\") pod \"certified-operators-vqs8m\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.608266 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.608330 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.608352 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzkh\" (UniqueName: \"kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.608369 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 
13:01:28.608681 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.108664892 +0000 UTC m=+153.563424023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.615164 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"af48c129f140534342b52e9f6c48b872bd064fa0b3343c7a9ce7fb6e37b24fee"} Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.615221 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cb3ef3b6c5e41e7c986780f867bc0321e680101f64d4fd14a1d7c9d913999ae4"} Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.628529 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"71d04d45d20c8ead3c5fe9e9cbb0f29ec38f2d740af9c5a58ec8f4fa2cd7d6e4"} Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.628581 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"faf83c5cc874363ffba01832e7a0941593376870774b779a7e4ccf6f423e5004"} Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.643231 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e9fb4f4ee7e08af6848f53dafceacb1381b3f7e5bae08386bcb52524a5c778b7"} Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.675356 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.709226 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.709566 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.709592 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzkh\" (UniqueName: \"kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.709611 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.710059 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.710125 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.210109738 +0000 UTC m=+153.664868859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.710344 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.750491 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.751827 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.812589 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.812676 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glm8h\" (UniqueName: \"kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.812726 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.812756 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.813034 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.313022282 +0000 UTC m=+153.767781403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.814311 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzkh\" (UniqueName: \"kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh\") pod \"community-operators-jvpvx\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.816613 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.839008 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.839786 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.842819 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.843222 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.843510 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.862458 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.925881 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.926086 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.926110 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glm8h\" (UniqueName: \"kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.926146 4724 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.926170 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.926191 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.927261 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.927705 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:28 crc kubenswrapper[4724]: E1002 13:01:28.933701 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.433667134 +0000 UTC m=+153.888426345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:28 crc kubenswrapper[4724]: I1002 13:01:28.961907 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glm8h\" (UniqueName: \"kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h\") pod \"certified-operators-vbrhf\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.027774 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.027848 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.027888 4724 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.028191 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.528178985 +0000 UTC m=+153.982938096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.028907 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.068438 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.075600 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.128593 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.128686 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.628669318 +0000 UTC m=+154.083428439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.128988 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.129247 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.629238912 +0000 UTC m=+154.083998033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.158850 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.218588 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-974gc" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.231307 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.231469 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.731444818 +0000 UTC m=+154.186203939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.231966 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.232362 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.732347361 +0000 UTC m=+154.187106482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.343521 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.344685 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.844666102 +0000 UTC m=+154.299425223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.346086 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.449780 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.450362 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:29.950349526 +0000 UTC m=+154.405108647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.551276 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.551625 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.051610768 +0000 UTC m=+154.506369889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.557647 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:29 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:29 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:29 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.557700 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.654394 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.655073 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:30.155040715 +0000 UTC m=+154.609799846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.668901 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerStarted","Data":"6cbe463c41af50340c677cd9ac3c3e8de0bb2f42ce683760e82c9192fe790af2"} Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.672809 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.690915 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9a0babb1ec37a82e763e8846fb436bbac85622f4ddaa5f974055d57006f81e51"} Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.691253 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.755344 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 
13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.755699 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.255685401 +0000 UTC m=+154.710444522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.793093 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.859431 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.861159 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.361143469 +0000 UTC m=+154.815902590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.963156 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:29 crc kubenswrapper[4724]: E1002 13:01:29.963876 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.463858618 +0000 UTC m=+154.918617739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:29 crc kubenswrapper[4724]: I1002 13:01:29.968501 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.029308 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:01:30 crc kubenswrapper[4724]: W1002 13:01:30.049320 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ea65d9_9080_4ccb_837c_ed218fce942c.slice/crio-2845d34547b3f270ab89794d03230275a4cf7551b5d4d47641452164adea77cb WatchSource:0}: Error finding container 2845d34547b3f270ab89794d03230275a4cf7551b5d4d47641452164adea77cb: Status 404 returned error can't find the container with id 2845d34547b3f270ab89794d03230275a4cf7551b5d4d47641452164adea77cb Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.065036 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.065860 4724 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xjrx9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]log ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]etcd ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/max-in-flight-filter ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 13:01:30 crc kubenswrapper[4724]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/openshift.io-startinformers ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 02 13:01:30 crc kubenswrapper[4724]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 13:01:30 crc kubenswrapper[4724]: livez check failed Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.065898 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" podUID="da7051e3-8a79-43e7-9016-9d492b51a9fd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.066322 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.565461588 +0000 UTC m=+155.020220709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.084884 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.133240 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.166373 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.166944 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.666922235 +0000 UTC m=+155.121681356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.171902 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.193158 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n4xw8" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.268090 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.269333 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.769321986 +0000 UTC m=+155.224081107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.282522 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.283480 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.295885 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.366420 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.369955 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.370124 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.870094325 +0000 UTC m=+155.324853446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.370230 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.370260 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.370294 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.370333 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzpn\" (UniqueName: 
\"kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.370628 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.870616439 +0000 UTC m=+155.325375620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.472074 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.472380 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzpn\" (UniqueName: \"kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.472462 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.472497 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.473059 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:30.97303794 +0000 UTC m=+155.427797061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.473410 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.473527 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.498316 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzpn\" (UniqueName: \"kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn\") pod \"redhat-marketplace-gffjg\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.547894 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:30 crc kubenswrapper[4724]: [-]has-synced failed: reason 
withheld Oct 02 13:01:30 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:30 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.548139 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.574138 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.574520 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.074505207 +0000 UTC m=+155.529264328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.655473 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.669883 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.671271 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.675078 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.675319 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.175295637 +0000 UTC m=+155.630054758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.675380 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.675707 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.175698997 +0000 UTC m=+155.630458118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.686397 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.708879 4724 generic.go:334] "Generic (PLEG): container finished" podID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerID="d703d45d5eb30e8f518382b3e220995248d4311408013d9d23f1550c4ff2cabe" exitCode=0 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.708985 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerDied","Data":"d703d45d5eb30e8f518382b3e220995248d4311408013d9d23f1550c4ff2cabe"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.709012 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerStarted","Data":"71011b27f9cac6dc14ee558a6dd8bba6b9bf3b68a7c036aeab241656c3950f66"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.711239 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75ffdd90-5d3e-419b-a17d-5ffced74428b","Type":"ContainerStarted","Data":"41340a35769bf7e557677c19ba3eb3d8b5ab6990a44a363a716547cb7562dba2"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.711290 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75ffdd90-5d3e-419b-a17d-5ffced74428b","Type":"ContainerStarted","Data":"7961cc56e89f61236b362558916f9d837100a801e49cce73ff8a8a39c8865467"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.714584 4724 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.718477 4724 generic.go:334] "Generic (PLEG): container finished" podID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerID="877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319" exitCode=0 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.718646 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerDied","Data":"877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.718711 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerStarted","Data":"2845d34547b3f270ab89794d03230275a4cf7551b5d4d47641452164adea77cb"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.724393 4724 generic.go:334] "Generic (PLEG): container finished" podID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerID="5c76ea0a4df2f657bdf82c01366fa9a0e66416917f2b5e476c5e5b9aeda8d94e" exitCode=0 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.724472 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerDied","Data":"5c76ea0a4df2f657bdf82c01366fa9a0e66416917f2b5e476c5e5b9aeda8d94e"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.732408 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerID="29a766e792b317ce1f19e9c92d3680cd801b8395ee30ccb12ef96e3bbf18f6b7" exitCode=0 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.732497 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerDied","Data":"29a766e792b317ce1f19e9c92d3680cd801b8395ee30ccb12ef96e3bbf18f6b7"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.732525 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerStarted","Data":"47dfa156b3695a99d262407baf1ed396e38a4bc4571440136a6c353b227f4692"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.737704 4724 generic.go:334] "Generic (PLEG): container finished" podID="59bd4d63-57bf-4b08-b7f1-f0c7d733e571" containerID="b0b9f435d0980740c089d63f6cd9fc7e33fbde41c7fa275fe2ad3c87a57bd9bd" exitCode=0 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.737847 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" event={"ID":"59bd4d63-57bf-4b08-b7f1-f0c7d733e571","Type":"ContainerDied","Data":"b0b9f435d0980740c089d63f6cd9fc7e33fbde41c7fa275fe2ad3c87a57bd9bd"} Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.767907 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.767888829 podStartE2EDuration="2.767888829s" podCreationTimestamp="2025-10-02 13:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:30.767480569 +0000 UTC m=+155.222239690" watchObservedRunningTime="2025-10-02 13:01:30.767888829 +0000 UTC m=+155.222647950" Oct 02 13:01:30 crc 
kubenswrapper[4724]: I1002 13:01:30.776600 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.776878 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmm8\" (UniqueName: \"kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.776960 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.776991 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.777134 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:31.277111943 +0000 UTC m=+155.731871074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.830847 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.838265 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.838336 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.838270 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.838422 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kl4xt" 
podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.878541 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.878627 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.878766 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.878850 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmm8\" (UniqueName: \"kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.879831 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.888732 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.889419 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.389400114 +0000 UTC m=+155.844159235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.917653 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:01:30 crc kubenswrapper[4724]: W1002 13:01:30.922126 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382d6bf1_1008_4b35_a7d2_fee3a1df7191.slice/crio-237d227c42ea57dd21478ffc56dd9d7c02f9b240c797af1741daf03b752bb0c9 WatchSource:0}: Error finding container 
237d227c42ea57dd21478ffc56dd9d7c02f9b240c797af1741daf03b752bb0c9: Status 404 returned error can't find the container with id 237d227c42ea57dd21478ffc56dd9d7c02f9b240c797af1741daf03b752bb0c9 Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.930679 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmm8\" (UniqueName: \"kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8\") pod \"redhat-marketplace-sqwtq\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:30 crc kubenswrapper[4724]: I1002 13:01:30.980014 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:30 crc kubenswrapper[4724]: E1002 13:01:30.981528 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.481500764 +0000 UTC m=+155.936259945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.032852 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.082222 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.082610 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.582595052 +0000 UTC m=+156.037354173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.183346 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.183601 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.683529505 +0000 UTC m=+156.138288626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.183851 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.184183 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.684175091 +0000 UTC m=+156.138934212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.256178 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.273485 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.275742 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.277814 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.278584 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.283988 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zmbp" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.284620 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 
13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.285184 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.785166897 +0000 UTC m=+156.239926028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.287474 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.299052 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d2rqx" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.386665 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.386772 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities\") pod \"redhat-operators-kpsxg\" (UID: 
\"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.386868 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxvjs\" (UniqueName: \"kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.386928 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.387579 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.887563027 +0000 UTC m=+156.342322148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.484558 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.484601 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.488518 4724 patch_prober.go:28] interesting pod/console-f9d7485db-lvb24 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.488594 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lvb24" podUID="131c7969-a8c8-4cfc-b655-ac3d400fae1b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.493401 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.493717 4724 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.993698852 +0000 UTC m=+156.448457973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.493771 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.493838 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.493910 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxvjs\" (UniqueName: \"kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " 
pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.493984 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.494192 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:31.994171584 +0000 UTC m=+156.448930755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.494381 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.543975 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.546911 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:31 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:31 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:31 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.546980 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.550508 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.553298 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-v94bt" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.562865 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxvjs\" (UniqueName: \"kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs\") pod \"redhat-operators-kpsxg\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.594741 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.594922 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.094897323 +0000 UTC m=+156.549656444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.596115 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.596505 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.096487963 +0000 UTC m=+156.551247144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.634241 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.672257 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.673253 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.678434 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.697667 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.698413 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.198385931 +0000 UTC m=+156.653145092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.698603 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.699518 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.199507039 +0000 UTC m=+156.654266240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.744682 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerStarted","Data":"b7db912730732343384974a313817912c3ccf840ebc17380fb4a7531d0012ddc"} Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.747493 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" event={"ID":"c8812b3c-bd75-49a8-b2a7-3db91675fc09","Type":"ContainerStarted","Data":"04dad5cfe4e4ddf44392bb03d4669e66ddda8daa0b8861d4ad397bf922c3d952"} Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.748511 4724 generic.go:334] "Generic (PLEG): container finished" podID="75ffdd90-5d3e-419b-a17d-5ffced74428b" containerID="41340a35769bf7e557677c19ba3eb3d8b5ab6990a44a363a716547cb7562dba2" exitCode=0 Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.748652 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75ffdd90-5d3e-419b-a17d-5ffced74428b","Type":"ContainerDied","Data":"41340a35769bf7e557677c19ba3eb3d8b5ab6990a44a363a716547cb7562dba2"} Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.752466 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" 
event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerStarted","Data":"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50"} Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.752848 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerStarted","Data":"237d227c42ea57dd21478ffc56dd9d7c02f9b240c797af1741daf03b752bb0c9"} Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.799753 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.799956 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.29992971 +0000 UTC m=+156.754688831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.800057 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.800104 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzv6\" (UniqueName: \"kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.800168 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.800187 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities\") 
pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.800553 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.300527515 +0000 UTC m=+156.755286636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.849869 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:01:31 crc kubenswrapper[4724]: W1002 13:01:31.860941 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d38739_e797_4866_aebe_290d90535c73.slice/crio-af41568fa000c117680a8c24b8be765baac88ed70e3cd427dc74304a1a794402 WatchSource:0}: Error finding container af41568fa000c117680a8c24b8be765baac88ed70e3cd427dc74304a1a794402: Status 404 returned error can't find the container with id af41568fa000c117680a8c24b8be765baac88ed70e3cd427dc74304a1a794402 Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.896858 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.898109 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.909663 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.909895 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzv6\" (UniqueName: \"kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.909950 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 13:01:31 crc kubenswrapper[4724]: E1002 13:01:31.909973 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.409927523 +0000 UTC m=+156.864686674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.910137 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.910181 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.909778 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.911895 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.912898 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 
13:01:31.918228 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.918942 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.945171 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6knf" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.945232 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.946738 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzv6\" (UniqueName: \"kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6\") pod \"redhat-operators-5mv8k\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:31 crc kubenswrapper[4724]: I1002 13:01:31.987760 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.012463 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.012602 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.012663 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.012980 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.512963909 +0000 UTC m=+156.967723030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.033022 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.113710 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume\") pod \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114107 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114155 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6zv\" (UniqueName: \"kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv\") pod \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114226 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume\") pod \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\" (UID: \"59bd4d63-57bf-4b08-b7f1-f0c7d733e571\") " Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.114288 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.614265241 +0000 UTC m=+157.069024362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114431 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114510 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114625 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114672 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.114855 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume" (OuterVolumeSpecName: "config-volume") pod "59bd4d63-57bf-4b08-b7f1-f0c7d733e571" (UID: "59bd4d63-57bf-4b08-b7f1-f0c7d733e571"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.114969 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.614958069 +0000 UTC m=+157.069717190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.118501 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59bd4d63-57bf-4b08-b7f1-f0c7d733e571" (UID: "59bd4d63-57bf-4b08-b7f1-f0c7d733e571"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.120143 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv" (OuterVolumeSpecName: "kube-api-access-pd6zv") pod "59bd4d63-57bf-4b08-b7f1-f0c7d733e571" (UID: "59bd4d63-57bf-4b08-b7f1-f0c7d733e571"). InnerVolumeSpecName "kube-api-access-pd6zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.136330 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.215445 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.215693 4724 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.215705 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6zv\" (UniqueName: \"kubernetes.io/projected/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-kube-api-access-pd6zv\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.215714 4724 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59bd4d63-57bf-4b08-b7f1-f0c7d733e571-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.216092 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 13:01:32.716073807 +0000 UTC m=+157.170832928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.225684 4724 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.243754 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:01:32 crc kubenswrapper[4724]: W1002 13:01:32.249763 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77faf449_b0a3_48ee_b35b_48bc77531443.slice/crio-4191e6853f8e0008b5fe074a0ee0734e2267df49c3a2f3eeaabb0c681b5d61d7 WatchSource:0}: Error finding container 4191e6853f8e0008b5fe074a0ee0734e2267df49c3a2f3eeaabb0c681b5d61d7: Status 404 returned error can't find the container with id 4191e6853f8e0008b5fe074a0ee0734e2267df49c3a2f3eeaabb0c681b5d61d7 Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.284788 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.317205 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.317643 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.817621926 +0000 UTC m=+157.272381097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.418184 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.418403 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.918373395 +0000 UTC m=+157.373132506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.418551 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:32 crc kubenswrapper[4724]: E1002 13:01:32.419019 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 13:01:32.919011532 +0000 UTC m=+157.373770653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5w5d" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.471050 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 13:01:32 crc kubenswrapper[4724]: W1002 13:01:32.479067 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca0192d8_a2a6_49b7_badb_253a4ec414e6.slice/crio-1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9 WatchSource:0}: Error finding container 1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9: Status 404 returned error can't find the container with id 1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9 Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.497404 4724 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T13:01:32.225711161Z","Handler":null,"Name":""} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.502069 4724 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.502104 4724 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.520344 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.523825 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.546332 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:32 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:32 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:32 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.546382 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.622226 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.758245 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" event={"ID":"c8812b3c-bd75-49a8-b2a7-3db91675fc09","Type":"ContainerStarted","Data":"20ea46286150a33a086135c5fce8f63224ab4a6ec295b6e5635aa5510c784802"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.759825 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" event={"ID":"59bd4d63-57bf-4b08-b7f1-f0c7d733e571","Type":"ContainerDied","Data":"a524d6b446d7b90cbd2429b18f49ceb830c84f9639ea535593aea7925d92ffe2"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.759875 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a524d6b446d7b90cbd2429b18f49ceb830c84f9639ea535593aea7925d92ffe2" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.760010 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-vz2xt" Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.761004 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca0192d8-a2a6-49b7-badb-253a4ec414e6","Type":"ContainerStarted","Data":"1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.762889 4724 generic.go:334] "Generic (PLEG): container finished" podID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerID="8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50" exitCode=0 Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.763052 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerDied","Data":"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.765661 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerStarted","Data":"4191e6853f8e0008b5fe074a0ee0734e2267df49c3a2f3eeaabb0c681b5d61d7"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.768331 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerStarted","Data":"af41568fa000c117680a8c24b8be765baac88ed70e3cd427dc74304a1a794402"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.770253 4724 generic.go:334] "Generic (PLEG): container finished" podID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerID="e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86" exitCode=0 Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.771308 4724 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerDied","Data":"e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86"} Oct 02 13:01:32 crc kubenswrapper[4724]: I1002 13:01:32.977758 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.131140 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access\") pod \"75ffdd90-5d3e-419b-a17d-5ffced74428b\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.131203 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir\") pod \"75ffdd90-5d3e-419b-a17d-5ffced74428b\" (UID: \"75ffdd90-5d3e-419b-a17d-5ffced74428b\") " Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.131510 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "75ffdd90-5d3e-419b-a17d-5ffced74428b" (UID: "75ffdd90-5d3e-419b-a17d-5ffced74428b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.136741 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "75ffdd90-5d3e-419b-a17d-5ffced74428b" (UID: "75ffdd90-5d3e-419b-a17d-5ffced74428b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.156565 4724 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.156641 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.232795 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75ffdd90-5d3e-419b-a17d-5ffced74428b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.232840 4724 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75ffdd90-5d3e-419b-a17d-5ffced74428b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.338029 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5w5d\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.513045 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.548204 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:33 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:33 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:33 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.548271 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.711632 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"] Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.780978 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerStarted","Data":"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668"} Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.781963 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" event={"ID":"7de6408f-76b4-4a9a-bf83-fe6c4e60848e","Type":"ContainerStarted","Data":"2642d328cc74da2067fa8fbdb4e7f4511fbf939d66829b3fc4488c951f2c0fd5"} Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.783435 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"75ffdd90-5d3e-419b-a17d-5ffced74428b","Type":"ContainerDied","Data":"7961cc56e89f61236b362558916f9d837100a801e49cce73ff8a8a39c8865467"} Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.783470 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7961cc56e89f61236b362558916f9d837100a801e49cce73ff8a8a39c8865467" Oct 02 13:01:33 crc kubenswrapper[4724]: I1002 13:01:33.783511 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.325670 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.547608 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:34 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:34 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:34 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.547909 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.733915 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.733988 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.793610 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" event={"ID":"c8812b3c-bd75-49a8-b2a7-3db91675fc09","Type":"ContainerStarted","Data":"6ed81a90401fa9257fd910ba930a7f88d86617746b1f9e0b8aaa45a52b1f00d7"} Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.797716 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca0192d8-a2a6-49b7-badb-253a4ec414e6","Type":"ContainerStarted","Data":"c88bba4cf06910fb30d212e78e70ce8050da1a475d64205b3577857c7f656565"} Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.801508 4724 generic.go:334] "Generic (PLEG): container finished" podID="77faf449-b0a3-48ee-b35b-48bc77531443" containerID="fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee" exitCode=0 Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.801577 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerDied","Data":"fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee"} Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.819408 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cwvv5" podStartSLOduration=16.8193781 podStartE2EDuration="16.8193781s" podCreationTimestamp="2025-10-02 13:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:34.816528668 +0000 UTC m=+159.271287799" watchObservedRunningTime="2025-10-02 13:01:34.8193781 +0000 UTC m=+159.274137221" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.823908 4724 generic.go:334] "Generic (PLEG): container finished" podID="b3d38739-e797-4866-aebe-290d90535c73" containerID="37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668" exitCode=0 Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.824448 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerDied","Data":"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668"} Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.829934 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" event={"ID":"7de6408f-76b4-4a9a-bf83-fe6c4e60848e","Type":"ContainerStarted","Data":"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c"} Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.829982 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.864392 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.864371608 podStartE2EDuration="3.864371608s" podCreationTimestamp="2025-10-02 13:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:34.861449674 +0000 UTC m=+159.316208795" watchObservedRunningTime="2025-10-02 13:01:34.864371608 +0000 UTC m=+159.319130729" Oct 02 13:01:34 crc kubenswrapper[4724]: I1002 13:01:34.905702 4724 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" podStartSLOduration=134.905681603 podStartE2EDuration="2m14.905681603s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:34.90161136 +0000 UTC m=+159.356370481" watchObservedRunningTime="2025-10-02 13:01:34.905681603 +0000 UTC m=+159.360440724" Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 13:01:35.054506 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 13:01:35.061601 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xjrx9" Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 13:01:35.546910 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:35 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:35 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:35 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 13:01:35.547005 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 13:01:35.841819 4724 generic.go:334] "Generic (PLEG): container finished" podID="ca0192d8-a2a6-49b7-badb-253a4ec414e6" containerID="c88bba4cf06910fb30d212e78e70ce8050da1a475d64205b3577857c7f656565" exitCode=0 Oct 02 13:01:35 crc kubenswrapper[4724]: I1002 
13:01:35.842820 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca0192d8-a2a6-49b7-badb-253a4ec414e6","Type":"ContainerDied","Data":"c88bba4cf06910fb30d212e78e70ce8050da1a475d64205b3577857c7f656565"} Oct 02 13:01:36 crc kubenswrapper[4724]: I1002 13:01:36.546776 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:36 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:36 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:36 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:36 crc kubenswrapper[4724]: I1002 13:01:36.547104 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:36 crc kubenswrapper[4724]: I1002 13:01:36.965785 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tq9w6" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.145304 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.299827 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir\") pod \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.299952 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access\") pod \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\" (UID: \"ca0192d8-a2a6-49b7-badb-253a4ec414e6\") " Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.299954 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ca0192d8-a2a6-49b7-badb-253a4ec414e6" (UID: "ca0192d8-a2a6-49b7-badb-253a4ec414e6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.300404 4724 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.307699 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ca0192d8-a2a6-49b7-badb-253a4ec414e6" (UID: "ca0192d8-a2a6-49b7-badb-253a4ec414e6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.401960 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0192d8-a2a6-49b7-badb-253a4ec414e6-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.550957 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:37 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:37 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:37 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.551051 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.856168 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ca0192d8-a2a6-49b7-badb-253a4ec414e6","Type":"ContainerDied","Data":"1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9"} Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.856205 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d635c2a3bd640a7ffa9242311863bfb7fe6beb0d354543516f2f6af7b9ab5e9" Oct 02 13:01:37 crc kubenswrapper[4724]: I1002 13:01:37.856258 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 13:01:38 crc kubenswrapper[4724]: I1002 13:01:38.547056 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:38 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:38 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:38 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:38 crc kubenswrapper[4724]: I1002 13:01:38.547129 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:39 crc kubenswrapper[4724]: I1002 13:01:39.547079 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:39 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:39 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:39 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:39 crc kubenswrapper[4724]: I1002 13:01:39.547148 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.546219 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:40 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:40 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:40 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.546570 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.836250 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.836300 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.836349 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:40 crc kubenswrapper[4724]: I1002 13:01:40.836409 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:41 crc 
kubenswrapper[4724]: I1002 13:01:41.485236 4724 patch_prober.go:28] interesting pod/console-f9d7485db-lvb24 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 02 13:01:41 crc kubenswrapper[4724]: I1002 13:01:41.485297 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lvb24" podUID="131c7969-a8c8-4cfc-b655-ac3d400fae1b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 02 13:01:41 crc kubenswrapper[4724]: I1002 13:01:41.545562 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:41 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:41 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:41 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:41 crc kubenswrapper[4724]: I1002 13:01:41.545619 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:42 crc kubenswrapper[4724]: I1002 13:01:42.547179 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:42 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:42 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:42 crc kubenswrapper[4724]: 
healthz check failed Oct 02 13:01:42 crc kubenswrapper[4724]: I1002 13:01:42.547236 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:43 crc kubenswrapper[4724]: I1002 13:01:43.298056 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:01:43 crc kubenswrapper[4724]: I1002 13:01:43.316701 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32e04071-6b34-4fc0-9783-f346a72fcf99-metrics-certs\") pod \"network-metrics-daemon-q7t2t\" (UID: \"32e04071-6b34-4fc0-9783-f346a72fcf99\") " pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:01:43 crc kubenswrapper[4724]: I1002 13:01:43.327587 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q7t2t" Oct 02 13:01:43 crc kubenswrapper[4724]: I1002 13:01:43.546956 4724 patch_prober.go:28] interesting pod/router-default-5444994796-n5rln container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 13:01:43 crc kubenswrapper[4724]: [-]has-synced failed: reason withheld Oct 02 13:01:43 crc kubenswrapper[4724]: [+]process-running ok Oct 02 13:01:43 crc kubenswrapper[4724]: healthz check failed Oct 02 13:01:43 crc kubenswrapper[4724]: I1002 13:01:43.547026 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5rln" podUID="6578bd1a-eaad-452a-adf5-7f3e34838677" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 13:01:44 crc kubenswrapper[4724]: I1002 13:01:44.547845 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:44 crc kubenswrapper[4724]: I1002 13:01:44.551634 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-n5rln" Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.836971 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.837299 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 
13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.837072 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.837358 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.837399 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.838009 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.838048 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.838187 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"1f16c7644b51cadbb9df0f1c385a7db415ba695ce93a476baa53667049828ae5"} pod="openshift-console/downloads-7954f5f757-kl4xt" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 02 
13:01:50 crc kubenswrapper[4724]: I1002 13:01:50.838313 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" containerID="cri-o://1f16c7644b51cadbb9df0f1c385a7db415ba695ce93a476baa53667049828ae5" gracePeriod=2 Oct 02 13:01:51 crc kubenswrapper[4724]: I1002 13:01:51.523493 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:51 crc kubenswrapper[4724]: I1002 13:01:51.530527 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lvb24" Oct 02 13:01:51 crc kubenswrapper[4724]: I1002 13:01:51.934105 4724 generic.go:334] "Generic (PLEG): container finished" podID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerID="1f16c7644b51cadbb9df0f1c385a7db415ba695ce93a476baa53667049828ae5" exitCode=0 Oct 02 13:01:51 crc kubenswrapper[4724]: I1002 13:01:51.934188 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kl4xt" event={"ID":"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7","Type":"ContainerDied","Data":"1f16c7644b51cadbb9df0f1c385a7db415ba695ce93a476baa53667049828ae5"} Oct 02 13:01:53 crc kubenswrapper[4724]: I1002 13:01:53.050199 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q7t2t"] Oct 02 13:01:53 crc kubenswrapper[4724]: I1002 13:01:53.519584 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:01:54 crc kubenswrapper[4724]: I1002 13:01:54.003356 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" event={"ID":"32e04071-6b34-4fc0-9783-f346a72fcf99","Type":"ContainerStarted","Data":"29d75e46feb94b5e37e97ae963b378d0e1b33aa4ec43baef06fa4cbaf1e8f4ae"} 
Oct 02 13:01:54 crc kubenswrapper[4724]: E1002 13:01:54.263740 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 13:01:54 crc kubenswrapper[4724]: E1002 13:01:54.264441 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jzkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-jvpvx_openshift-marketplace(a4fcd146-2be5-4691-a6db-4e9ed60b4711): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 13:01:54 crc kubenswrapper[4724]: E1002 13:01:54.266059 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jvpvx" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" Oct 02 13:01:55 crc kubenswrapper[4724]: E1002 13:01:55.373481 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 13:01:55 crc kubenswrapper[4724]: E1002 13:01:55.373746 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whnfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x7mvk_openshift-marketplace(d1808303-74a2-424b-9dd4-64838d28a1c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 13:01:55 crc kubenswrapper[4724]: E1002 13:01:55.375024 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x7mvk" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" Oct 02 13:01:59 crc 
kubenswrapper[4724]: E1002 13:01:59.090811 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 13:01:59 crc kubenswrapper[4724]: E1002 13:01:59.091178 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bn79v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-vqs8m_openshift-marketplace(53ecbc18-3d93-4ee9-bd02-e3e99db2a82d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 13:01:59 crc kubenswrapper[4724]: E1002 13:01:59.092331 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vqs8m" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" Oct 02 13:02:00 crc kubenswrapper[4724]: I1002 13:02:00.036282 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" event={"ID":"32e04071-6b34-4fc0-9783-f346a72fcf99","Type":"ContainerStarted","Data":"801ca06c227f517307321f8e9e956193375ca4bcd33f2775cb73483eb70df569"} Oct 02 13:02:00 crc kubenswrapper[4724]: I1002 13:02:00.838382 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:00 crc kubenswrapper[4724]: I1002 13:02:00.838478 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:01 crc kubenswrapper[4724]: I1002 13:02:01.640089 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qbbfj" Oct 02 13:02:04 crc kubenswrapper[4724]: I1002 13:02:04.734074 4724 patch_prober.go:28] interesting 
pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:02:04 crc kubenswrapper[4724]: I1002 13:02:04.735316 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:02:07 crc kubenswrapper[4724]: E1002 13:02:07.128694 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vqs8m" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" Oct 02 13:02:08 crc kubenswrapper[4724]: I1002 13:02:08.507746 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 13:02:09 crc kubenswrapper[4724]: E1002 13:02:09.025990 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 13:02:09 crc kubenswrapper[4724]: E1002 13:02:09.026179 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxvjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kpsxg_openshift-marketplace(b3d38739-e797-4866-aebe-290d90535c73): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 13:02:09 crc kubenswrapper[4724]: E1002 13:02:09.028783 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kpsxg" podUID="b3d38739-e797-4866-aebe-290d90535c73" Oct 02 13:02:09 crc 
kubenswrapper[4724]: E1002 13:02:09.361137 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 13:02:09 crc kubenswrapper[4724]: E1002 13:02:09.361313 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tzv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5mv8k_openshift-marketplace(77faf449-b0a3-48ee-b35b-48bc77531443): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 13:02:09 crc kubenswrapper[4724]: E1002 13:02:09.362579 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5mv8k" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" Oct 02 13:02:10 crc kubenswrapper[4724]: I1002 13:02:10.836373 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:10 crc kubenswrapper[4724]: I1002 13:02:10.836800 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:12 crc kubenswrapper[4724]: E1002 13:02:12.953216 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kpsxg" podUID="b3d38739-e797-4866-aebe-290d90535c73" Oct 02 13:02:14 crc kubenswrapper[4724]: E1002 13:02:14.904627 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5mv8k" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" Oct 02 13:02:14 crc kubenswrapper[4724]: E1002 13:02:14.912900 4724 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3418673399/3\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 13:02:14 crc kubenswrapper[4724]: E1002 13:02:14.913050 4724 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbmm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},
Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sqwtq_openshift-marketplace(e0256dd2-79a4-46fb-8698-ea99d23a67de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3418673399/3\": happened during read: context canceled" logger="UnhandledError" Oct 02 13:02:14 crc kubenswrapper[4724]: E1002 13:02:14.914293 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3418673399/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sqwtq" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.133590 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerStarted","Data":"c9c1b438e8cf8ec66ad0f7b7e17df718f9ca88abd6cd9615a3d8f747118aa1cf"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.135517 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kl4xt" event={"ID":"84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7","Type":"ContainerStarted","Data":"ebf81c9d3edad9fa2614eacc436358bd1b34f2bb3522319edd01c09d9f97a2e0"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.135728 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.136038 4724 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.136080 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.137893 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q7t2t" event={"ID":"32e04071-6b34-4fc0-9783-f346a72fcf99","Type":"ContainerStarted","Data":"6d25ecd898606be256d93d35699585d283fa6257a344f395fe44e0cb0a700b1f"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.140081 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerStarted","Data":"c394e6bbf092187b6b4347139edee7cc204e764a123f5ad1b5826aecf9741748"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.142159 4724 generic.go:334] "Generic (PLEG): container finished" podID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerID="8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5" exitCode=0 Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.142229 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerDied","Data":"8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.144018 4724 generic.go:334] "Generic (PLEG): container finished" podID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" 
containerID="d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79" exitCode=0 Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.144042 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerDied","Data":"d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79"} Oct 02 13:02:19 crc kubenswrapper[4724]: I1002 13:02:19.192273 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q7t2t" podStartSLOduration=179.192254123 podStartE2EDuration="2m59.192254123s" podCreationTimestamp="2025-10-02 12:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:02:19.188700023 +0000 UTC m=+203.643459144" watchObservedRunningTime="2025-10-02 13:02:19.192254123 +0000 UTC m=+203.647013244" Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.154256 4724 generic.go:334] "Generic (PLEG): container finished" podID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerID="c394e6bbf092187b6b4347139edee7cc204e764a123f5ad1b5826aecf9741748" exitCode=0 Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.154346 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerDied","Data":"c394e6bbf092187b6b4347139edee7cc204e764a123f5ad1b5826aecf9741748"} Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.158115 4724 generic.go:334] "Generic (PLEG): container finished" podID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerID="c9c1b438e8cf8ec66ad0f7b7e17df718f9ca88abd6cd9615a3d8f747118aa1cf" exitCode=0 Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.158227 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" 
event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerDied","Data":"c9c1b438e8cf8ec66ad0f7b7e17df718f9ca88abd6cd9615a3d8f747118aa1cf"} Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.158983 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.159058 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.836040 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.836379 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.836046 4724 patch_prober.go:28] interesting pod/downloads-7954f5f757-kl4xt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 02 13:02:20 crc kubenswrapper[4724]: I1002 13:02:20.836500 4724 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-kl4xt" podUID="84a3f9d9-8a1f-45ed-ad6f-3c8eb02738e7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.167847 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerStarted","Data":"451e7e9fdef3df62f41e5fee8b3cb9fe314f91b1b4872d3f76eb90260d7d5241"} Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.170714 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerStarted","Data":"ac92d51792a6d2396fed78f9d50909313d32629b671a79f4ec2d4209acb6aa4d"} Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.176939 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerStarted","Data":"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117"} Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.179234 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerStarted","Data":"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f"} Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.189188 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x7mvk" podStartSLOduration=3.20781826 podStartE2EDuration="53.189173166s" podCreationTimestamp="2025-10-02 13:01:28 +0000 UTC" firstStartedPulling="2025-10-02 13:01:30.727373715 +0000 UTC m=+155.182132836" lastFinishedPulling="2025-10-02 13:02:20.708728621 +0000 UTC 
m=+205.163487742" observedRunningTime="2025-10-02 13:02:21.188176211 +0000 UTC m=+205.642935332" watchObservedRunningTime="2025-10-02 13:02:21.189173166 +0000 UTC m=+205.643932287" Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.208244 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jvpvx" podStartSLOduration=3.138295293 podStartE2EDuration="53.208227151s" podCreationTimestamp="2025-10-02 13:01:28 +0000 UTC" firstStartedPulling="2025-10-02 13:01:30.711344149 +0000 UTC m=+155.166103270" lastFinishedPulling="2025-10-02 13:02:20.781276007 +0000 UTC m=+205.236035128" observedRunningTime="2025-10-02 13:02:21.205972683 +0000 UTC m=+205.660731814" watchObservedRunningTime="2025-10-02 13:02:21.208227151 +0000 UTC m=+205.662986272" Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.227860 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vbrhf" podStartSLOduration=3.302738216 podStartE2EDuration="53.22784157s" podCreationTimestamp="2025-10-02 13:01:28 +0000 UTC" firstStartedPulling="2025-10-02 13:01:30.720119291 +0000 UTC m=+155.174878412" lastFinishedPulling="2025-10-02 13:02:20.645222645 +0000 UTC m=+205.099981766" observedRunningTime="2025-10-02 13:02:21.224285649 +0000 UTC m=+205.679044780" watchObservedRunningTime="2025-10-02 13:02:21.22784157 +0000 UTC m=+205.682600691" Oct 02 13:02:21 crc kubenswrapper[4724]: I1002 13:02:21.245572 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gffjg" podStartSLOduration=3.252839129 podStartE2EDuration="51.245558601s" podCreationTimestamp="2025-10-02 13:01:30 +0000 UTC" firstStartedPulling="2025-10-02 13:01:32.764319878 +0000 UTC m=+157.219078999" lastFinishedPulling="2025-10-02 13:02:20.75703935 +0000 UTC m=+205.211798471" observedRunningTime="2025-10-02 13:02:21.244022582 +0000 UTC m=+205.698781703" 
watchObservedRunningTime="2025-10-02 13:02:21.245558601 +0000 UTC m=+205.700317722" Oct 02 13:02:26 crc kubenswrapper[4724]: I1002 13:02:26.203843 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerStarted","Data":"efb3cf44d6d4b54e217263186c9e9c56adaa91cd883d6257d3b9e3b7961a6dad"} Oct 02 13:02:27 crc kubenswrapper[4724]: I1002 13:02:27.211350 4724 generic.go:334] "Generic (PLEG): container finished" podID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerID="efb3cf44d6d4b54e217263186c9e9c56adaa91cd883d6257d3b9e3b7961a6dad" exitCode=0 Oct 02 13:02:27 crc kubenswrapper[4724]: I1002 13:02:27.211386 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerDied","Data":"efb3cf44d6d4b54e217263186c9e9c56adaa91cd883d6257d3b9e3b7961a6dad"} Oct 02 13:02:28 crc kubenswrapper[4724]: I1002 13:02:28.470909 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:02:28 crc kubenswrapper[4724]: I1002 13:02:28.471218 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:02:28 crc kubenswrapper[4724]: I1002 13:02:28.843894 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:28 crc kubenswrapper[4724]: I1002 13:02:28.844196 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.070658 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.070698 
4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.233286 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.234331 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.234726 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.286256 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:29 crc kubenswrapper[4724]: I1002 13:02:29.286333 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.228911 4724 generic.go:334] "Generic (PLEG): container finished" podID="77faf449-b0a3-48ee-b35b-48bc77531443" containerID="f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2" exitCode=0 Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.229021 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerDied","Data":"f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2"} Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.231561 4724 generic.go:334] "Generic (PLEG): container finished" podID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerID="b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13" exitCode=0 Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.231651 4724 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerDied","Data":"b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13"} Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.276432 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.656434 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.656526 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.715896 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:02:30 crc kubenswrapper[4724]: I1002 13:02:30.842918 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-kl4xt" Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.238680 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerStarted","Data":"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1"} Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.242169 4724 generic.go:334] "Generic (PLEG): container finished" podID="b3d38739-e797-4866-aebe-290d90535c73" containerID="a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab" exitCode=0 Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.242274 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" 
event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerDied","Data":"a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab"} Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.244909 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerStarted","Data":"b27dc3cdb4ccad2c21ab78ceb037953bcfe9567e41c01e14e1d80d387123ee01"} Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.247702 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerStarted","Data":"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028"} Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.259015 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5mv8k" podStartSLOduration=4.3985313 podStartE2EDuration="1m0.258993565s" podCreationTimestamp="2025-10-02 13:01:31 +0000 UTC" firstStartedPulling="2025-10-02 13:01:34.802477582 +0000 UTC m=+159.257236703" lastFinishedPulling="2025-10-02 13:02:30.662939847 +0000 UTC m=+215.117698968" observedRunningTime="2025-10-02 13:02:31.25686803 +0000 UTC m=+215.711627151" watchObservedRunningTime="2025-10-02 13:02:31.258993565 +0000 UTC m=+215.713752686" Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.286319 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.304916 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqs8m" podStartSLOduration=4.418642239 podStartE2EDuration="1m3.304897893s" podCreationTimestamp="2025-10-02 13:01:28 +0000 UTC" firstStartedPulling="2025-10-02 13:01:30.733834888 +0000 UTC 
m=+155.188594009" lastFinishedPulling="2025-10-02 13:02:29.620090542 +0000 UTC m=+214.074849663" observedRunningTime="2025-10-02 13:02:31.285735345 +0000 UTC m=+215.740494466" watchObservedRunningTime="2025-10-02 13:02:31.304897893 +0000 UTC m=+215.759657014" Oct 02 13:02:31 crc kubenswrapper[4724]: I1002 13:02:31.321242 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqwtq" podStartSLOduration=5.453201066 podStartE2EDuration="1m1.321222108s" podCreationTimestamp="2025-10-02 13:01:30 +0000 UTC" firstStartedPulling="2025-10-02 13:01:34.837750845 +0000 UTC m=+159.292509966" lastFinishedPulling="2025-10-02 13:02:30.705771887 +0000 UTC m=+215.160531008" observedRunningTime="2025-10-02 13:02:31.320267914 +0000 UTC m=+215.775027035" watchObservedRunningTime="2025-10-02 13:02:31.321222108 +0000 UTC m=+215.775981229" Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.034045 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.034435 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.255717 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerStarted","Data":"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940"} Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.277322 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpsxg" podStartSLOduration=4.371719663 podStartE2EDuration="1m1.277300165s" podCreationTimestamp="2025-10-02 13:01:31 +0000 UTC" firstStartedPulling="2025-10-02 13:01:34.827474735 +0000 UTC m=+159.282233866" 
lastFinishedPulling="2025-10-02 13:02:31.733055247 +0000 UTC m=+216.187814368" observedRunningTime="2025-10-02 13:02:32.276627758 +0000 UTC m=+216.731386879" watchObservedRunningTime="2025-10-02 13:02:32.277300165 +0000 UTC m=+216.732059286" Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.373020 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.373519 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jvpvx" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="registry-server" containerID="cri-o://ac92d51792a6d2396fed78f9d50909313d32629b671a79f4ec2d4209acb6aa4d" gracePeriod=2 Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.573942 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:02:32 crc kubenswrapper[4724]: I1002 13:02:32.574271 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vbrhf" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="registry-server" containerID="cri-o://5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117" gracePeriod=2 Oct 02 13:02:33 crc kubenswrapper[4724]: I1002 13:02:33.078498 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5mv8k" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="registry-server" probeResult="failure" output=< Oct 02 13:02:33 crc kubenswrapper[4724]: timeout: failed to connect service ":50051" within 1s Oct 02 13:02:33 crc kubenswrapper[4724]: > Oct 02 13:02:33 crc kubenswrapper[4724]: I1002 13:02:33.262454 4724 generic.go:334] "Generic (PLEG): container finished" podID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerID="ac92d51792a6d2396fed78f9d50909313d32629b671a79f4ec2d4209acb6aa4d" 
exitCode=0 Oct 02 13:02:33 crc kubenswrapper[4724]: I1002 13:02:33.262560 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerDied","Data":"ac92d51792a6d2396fed78f9d50909313d32629b671a79f4ec2d4209acb6aa4d"} Oct 02 13:02:33 crc kubenswrapper[4724]: I1002 13:02:33.919065 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:33 crc kubenswrapper[4724]: I1002 13:02:33.981353 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.030527 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content\") pod \"d3ea65d9-9080-4ccb-837c-ed218fce942c\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.030596 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glm8h\" (UniqueName: \"kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h\") pod \"d3ea65d9-9080-4ccb-837c-ed218fce942c\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.030705 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities\") pod \"d3ea65d9-9080-4ccb-837c-ed218fce942c\" (UID: \"d3ea65d9-9080-4ccb-837c-ed218fce942c\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.031664 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities" (OuterVolumeSpecName: "utilities") pod "d3ea65d9-9080-4ccb-837c-ed218fce942c" (UID: "d3ea65d9-9080-4ccb-837c-ed218fce942c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.036704 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h" (OuterVolumeSpecName: "kube-api-access-glm8h") pod "d3ea65d9-9080-4ccb-837c-ed218fce942c" (UID: "d3ea65d9-9080-4ccb-837c-ed218fce942c"). InnerVolumeSpecName "kube-api-access-glm8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.075457 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3ea65d9-9080-4ccb-837c-ed218fce942c" (UID: "d3ea65d9-9080-4ccb-837c-ed218fce942c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.131862 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities\") pod \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.131912 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jzkh\" (UniqueName: \"kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh\") pod \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.132014 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content\") pod \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\" (UID: \"a4fcd146-2be5-4691-a6db-4e9ed60b4711\") " Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.132281 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.132303 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ea65d9-9080-4ccb-837c-ed218fce942c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.132319 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glm8h\" (UniqueName: \"kubernetes.io/projected/d3ea65d9-9080-4ccb-837c-ed218fce942c-kube-api-access-glm8h\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.132715 
4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities" (OuterVolumeSpecName: "utilities") pod "a4fcd146-2be5-4691-a6db-4e9ed60b4711" (UID: "a4fcd146-2be5-4691-a6db-4e9ed60b4711"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.137905 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh" (OuterVolumeSpecName: "kube-api-access-2jzkh") pod "a4fcd146-2be5-4691-a6db-4e9ed60b4711" (UID: "a4fcd146-2be5-4691-a6db-4e9ed60b4711"). InnerVolumeSpecName "kube-api-access-2jzkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.175251 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4fcd146-2be5-4691-a6db-4e9ed60b4711" (UID: "a4fcd146-2be5-4691-a6db-4e9ed60b4711"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.233767 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.233805 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4fcd146-2be5-4691-a6db-4e9ed60b4711-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.233817 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jzkh\" (UniqueName: \"kubernetes.io/projected/a4fcd146-2be5-4691-a6db-4e9ed60b4711-kube-api-access-2jzkh\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.269245 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvpvx" event={"ID":"a4fcd146-2be5-4691-a6db-4e9ed60b4711","Type":"ContainerDied","Data":"71011b27f9cac6dc14ee558a6dd8bba6b9bf3b68a7c036aeab241656c3950f66"} Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.269286 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jvpvx" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.269314 4724 scope.go:117] "RemoveContainer" containerID="ac92d51792a6d2396fed78f9d50909313d32629b671a79f4ec2d4209acb6aa4d" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.284682 4724 generic.go:334] "Generic (PLEG): container finished" podID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerID="5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117" exitCode=0 Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.284723 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerDied","Data":"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117"} Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.284738 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbrhf" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.284749 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbrhf" event={"ID":"d3ea65d9-9080-4ccb-837c-ed218fce942c","Type":"ContainerDied","Data":"2845d34547b3f270ab89794d03230275a4cf7551b5d4d47641452164adea77cb"} Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.299625 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.301566 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jvpvx"] Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.306596 4724 scope.go:117] "RemoveContainer" containerID="c394e6bbf092187b6b4347139edee7cc204e764a123f5ad1b5826aecf9741748" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.324697 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" path="/var/lib/kubelet/pods/a4fcd146-2be5-4691-a6db-4e9ed60b4711/volumes" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.329347 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.332779 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vbrhf"] Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.335966 4724 scope.go:117] "RemoveContainer" containerID="d703d45d5eb30e8f518382b3e220995248d4311408013d9d23f1550c4ff2cabe" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.349129 4724 scope.go:117] "RemoveContainer" containerID="5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.364636 4724 scope.go:117] "RemoveContainer" containerID="8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.383451 4724 scope.go:117] "RemoveContainer" containerID="877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.401387 4724 scope.go:117] "RemoveContainer" containerID="5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117" Oct 02 13:02:34 crc kubenswrapper[4724]: E1002 13:02:34.401974 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117\": container with ID starting with 5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117 not found: ID does not exist" containerID="5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.402015 4724 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117"} err="failed to get container status \"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117\": rpc error: code = NotFound desc = could not find container \"5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117\": container with ID starting with 5f29ab5a7a5067f5866a2445dd888336a54fe9d5b30aab674e048bbe5577b117 not found: ID does not exist" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.402043 4724 scope.go:117] "RemoveContainer" containerID="8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5" Oct 02 13:02:34 crc kubenswrapper[4724]: E1002 13:02:34.402403 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5\": container with ID starting with 8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5 not found: ID does not exist" containerID="8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.402430 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5"} err="failed to get container status \"8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5\": rpc error: code = NotFound desc = could not find container \"8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5\": container with ID starting with 8352ea29c3996ce57649bf1f775e809c6c91f84a4a2ac6f022e96ade3f2b5ea5 not found: ID does not exist" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.402442 4724 scope.go:117] "RemoveContainer" containerID="877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319" Oct 02 13:02:34 crc kubenswrapper[4724]: E1002 13:02:34.402743 4724 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319\": container with ID starting with 877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319 not found: ID does not exist" containerID="877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.402777 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319"} err="failed to get container status \"877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319\": rpc error: code = NotFound desc = could not find container \"877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319\": container with ID starting with 877923f5b01e3fbc5136808220a3012fe4fd40676b8aa3929dbe6da9dba05319 not found: ID does not exist" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.734348 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.734435 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.734487 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.735141 4724 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:02:34 crc kubenswrapper[4724]: I1002 13:02:34.735212 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548" gracePeriod=600 Oct 02 13:02:36 crc kubenswrapper[4724]: I1002 13:02:36.300484 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548" exitCode=0 Oct 02 13:02:36 crc kubenswrapper[4724]: I1002 13:02:36.300607 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548"} Oct 02 13:02:36 crc kubenswrapper[4724]: I1002 13:02:36.321731 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" path="/var/lib/kubelet/pods/d3ea65d9-9080-4ccb-837c-ed218fce942c/volumes" Oct 02 13:02:37 crc kubenswrapper[4724]: I1002 13:02:37.308095 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895"} Oct 02 13:02:38 crc kubenswrapper[4724]: I1002 13:02:38.676279 4724 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:02:38 crc kubenswrapper[4724]: I1002 13:02:38.676662 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:02:38 crc kubenswrapper[4724]: I1002 13:02:38.718233 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:02:39 crc kubenswrapper[4724]: I1002 13:02:39.356736 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.033185 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.033865 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.070777 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.369839 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.411106 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.634990 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:02:41 crc kubenswrapper[4724]: I1002 13:02:41.635311 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:02:41 crc 
kubenswrapper[4724]: I1002 13:02:41.672088 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:02:42 crc kubenswrapper[4724]: I1002 13:02:42.074510 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:42 crc kubenswrapper[4724]: I1002 13:02:42.119755 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:42 crc kubenswrapper[4724]: I1002 13:02:42.368305 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.336819 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqwtq" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="registry-server" containerID="cri-o://c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028" gracePeriod=2 Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.773972 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.905679 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.906343 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5mv8k" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="registry-server" containerID="cri-o://970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1" gracePeriod=2 Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.959841 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbmm8\" (UniqueName: \"kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8\") pod \"e0256dd2-79a4-46fb-8698-ea99d23a67de\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.959916 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content\") pod \"e0256dd2-79a4-46fb-8698-ea99d23a67de\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.959949 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities\") pod \"e0256dd2-79a4-46fb-8698-ea99d23a67de\" (UID: \"e0256dd2-79a4-46fb-8698-ea99d23a67de\") " Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.960984 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities" (OuterVolumeSpecName: "utilities") pod "e0256dd2-79a4-46fb-8698-ea99d23a67de" (UID: 
"e0256dd2-79a4-46fb-8698-ea99d23a67de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.966604 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8" (OuterVolumeSpecName: "kube-api-access-vbmm8") pod "e0256dd2-79a4-46fb-8698-ea99d23a67de" (UID: "e0256dd2-79a4-46fb-8698-ea99d23a67de"). InnerVolumeSpecName "kube-api-access-vbmm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4724]: I1002 13:02:43.974847 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0256dd2-79a4-46fb-8698-ea99d23a67de" (UID: "e0256dd2-79a4-46fb-8698-ea99d23a67de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.061418 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbmm8\" (UniqueName: \"kubernetes.io/projected/e0256dd2-79a4-46fb-8698-ea99d23a67de-kube-api-access-vbmm8\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.061458 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.061469 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0256dd2-79a4-46fb-8698-ea99d23a67de-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.343403 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerID="c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028" exitCode=0 Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.343441 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerDied","Data":"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028"} Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.343465 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqwtq" event={"ID":"e0256dd2-79a4-46fb-8698-ea99d23a67de","Type":"ContainerDied","Data":"b7db912730732343384974a313817912c3ccf840ebc17380fb4a7531d0012ddc"} Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.343471 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqwtq" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.343484 4724 scope.go:117] "RemoveContainer" containerID="c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.360444 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.363207 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqwtq"] Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.364588 4724 scope.go:117] "RemoveContainer" containerID="b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.429594 4724 scope.go:117] "RemoveContainer" containerID="e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.442322 4724 scope.go:117] "RemoveContainer" 
containerID="c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028" Oct 02 13:02:44 crc kubenswrapper[4724]: E1002 13:02:44.442756 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028\": container with ID starting with c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028 not found: ID does not exist" containerID="c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.442791 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028"} err="failed to get container status \"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028\": rpc error: code = NotFound desc = could not find container \"c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028\": container with ID starting with c0b5a5ca2dbc62a5a6ccac3b8da7753c8fccd91fe94b00eeac8fab4707b72028 not found: ID does not exist" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.442815 4724 scope.go:117] "RemoveContainer" containerID="b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13" Oct 02 13:02:44 crc kubenswrapper[4724]: E1002 13:02:44.443205 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13\": container with ID starting with b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13 not found: ID does not exist" containerID="b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.443225 4724 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13"} err="failed to get container status \"b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13\": rpc error: code = NotFound desc = could not find container \"b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13\": container with ID starting with b74c8e167c736678ca22d51c3f655f88015a6cd7139b689f82fbcf8615960a13 not found: ID does not exist" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.443240 4724 scope.go:117] "RemoveContainer" containerID="e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86" Oct 02 13:02:44 crc kubenswrapper[4724]: E1002 13:02:44.443522 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86\": container with ID starting with e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86 not found: ID does not exist" containerID="e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.443586 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86"} err="failed to get container status \"e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86\": rpc error: code = NotFound desc = could not find container \"e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86\": container with ID starting with e5b878368cc85fc83d3bc35b26ebee188ae017d2e72c49fab10c6a3cc0c43b86 not found: ID does not exist" Oct 02 13:02:44 crc kubenswrapper[4724]: I1002 13:02:44.941180 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.072027 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities\") pod \"77faf449-b0a3-48ee-b35b-48bc77531443\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.072076 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tzv6\" (UniqueName: \"kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6\") pod \"77faf449-b0a3-48ee-b35b-48bc77531443\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.072132 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content\") pod \"77faf449-b0a3-48ee-b35b-48bc77531443\" (UID: \"77faf449-b0a3-48ee-b35b-48bc77531443\") " Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.073327 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities" (OuterVolumeSpecName: "utilities") pod "77faf449-b0a3-48ee-b35b-48bc77531443" (UID: "77faf449-b0a3-48ee-b35b-48bc77531443"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.081731 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6" (OuterVolumeSpecName: "kube-api-access-2tzv6") pod "77faf449-b0a3-48ee-b35b-48bc77531443" (UID: "77faf449-b0a3-48ee-b35b-48bc77531443"). InnerVolumeSpecName "kube-api-access-2tzv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.173599 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.173851 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tzv6\" (UniqueName: \"kubernetes.io/projected/77faf449-b0a3-48ee-b35b-48bc77531443-kube-api-access-2tzv6\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.180187 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77faf449-b0a3-48ee-b35b-48bc77531443" (UID: "77faf449-b0a3-48ee-b35b-48bc77531443"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.274824 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77faf449-b0a3-48ee-b35b-48bc77531443-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.353796 4724 generic.go:334] "Generic (PLEG): container finished" podID="77faf449-b0a3-48ee-b35b-48bc77531443" containerID="970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1" exitCode=0 Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.353836 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerDied","Data":"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1"} Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.353877 4724 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5mv8k" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.353902 4724 scope.go:117] "RemoveContainer" containerID="970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.353887 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mv8k" event={"ID":"77faf449-b0a3-48ee-b35b-48bc77531443","Type":"ContainerDied","Data":"4191e6853f8e0008b5fe074a0ee0734e2267df49c3a2f3eeaabb0c681b5d61d7"} Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.367878 4724 scope.go:117] "RemoveContainer" containerID="f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.383164 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.385995 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5mv8k"] Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.405038 4724 scope.go:117] "RemoveContainer" containerID="fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.420465 4724 scope.go:117] "RemoveContainer" containerID="970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1" Oct 02 13:02:45 crc kubenswrapper[4724]: E1002 13:02:45.420995 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1\": container with ID starting with 970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1 not found: ID does not exist" containerID="970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.421089 4724 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1"} err="failed to get container status \"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1\": rpc error: code = NotFound desc = could not find container \"970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1\": container with ID starting with 970570f7638d19975b9d63961c920aed8b2a61e176ca80e665a49c08537630c1 not found: ID does not exist" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.421133 4724 scope.go:117] "RemoveContainer" containerID="f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2" Oct 02 13:02:45 crc kubenswrapper[4724]: E1002 13:02:45.421496 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2\": container with ID starting with f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2 not found: ID does not exist" containerID="f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.421576 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2"} err="failed to get container status \"f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2\": rpc error: code = NotFound desc = could not find container \"f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2\": container with ID starting with f8f0ae206be3eef06674844f49dafa003f0bb5b8cb4e2046df326a73f21b1da2 not found: ID does not exist" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.421606 4724 scope.go:117] "RemoveContainer" containerID="fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee" Oct 02 13:02:45 crc kubenswrapper[4724]: E1002 
13:02:45.421893 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee\": container with ID starting with fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee not found: ID does not exist" containerID="fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee" Oct 02 13:02:45 crc kubenswrapper[4724]: I1002 13:02:45.421913 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee"} err="failed to get container status \"fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee\": rpc error: code = NotFound desc = could not find container \"fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee\": container with ID starting with fceaf2f7c19317b58ac427280617e4f14b467758c373158918ec051cf1996aee not found: ID does not exist" Oct 02 13:02:46 crc kubenswrapper[4724]: I1002 13:02:46.320047 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" path="/var/lib/kubelet/pods/77faf449-b0a3-48ee-b35b-48bc77531443/volumes" Oct 02 13:02:46 crc kubenswrapper[4724]: I1002 13:02:46.320724 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" path="/var/lib/kubelet/pods/e0256dd2-79a4-46fb-8698-ea99d23a67de/volumes" Oct 02 13:02:50 crc kubenswrapper[4724]: I1002 13:02:50.593001 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"] Oct 02 13:03:15 crc kubenswrapper[4724]: I1002 13:03:15.622356 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" 
containerID="cri-o://d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4" gracePeriod=15 Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.002938 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.043623 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-df7774cfb-l6b7d"] Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.043916 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.043933 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.043954 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.043965 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.043980 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.043991 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044007 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0192d8-a2a6-49b7-badb-253a4ec414e6" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044019 4724 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ca0192d8-a2a6-49b7-badb-253a4ec414e6" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044031 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044040 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044058 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044069 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044081 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044091 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044105 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044117 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044134 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044146 4724 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044163 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bd4d63-57bf-4b08-b7f1-f0c7d733e571" containerName="collect-profiles" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044176 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bd4d63-57bf-4b08-b7f1-f0c7d733e571" containerName="collect-profiles" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044195 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044207 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="extract-content" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044220 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044231 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044246 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ffdd90-5d3e-419b-a17d-5ffced74428b" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044256 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ffdd90-5d3e-419b-a17d-5ffced74428b" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044274 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044286 4724 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044303 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044314 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="extract-utilities" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.044330 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044342 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044488 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="77faf449-b0a3-48ee-b35b-48bc77531443" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044578 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerName="oauth-openshift" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044598 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ffdd90-5d3e-419b-a17d-5ffced74428b" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044612 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ea65d9-9080-4ccb-837c-ed218fce942c" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044628 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0256dd2-79a4-46fb-8698-ea99d23a67de" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044645 4724 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="59bd4d63-57bf-4b08-b7f1-f0c7d733e571" containerName="collect-profiles" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044657 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0192d8-a2a6-49b7-badb-253a4ec414e6" containerName="pruner" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.044672 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fcd146-2be5-4691-a6db-4e9ed60b4711" containerName="registry-server" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.045172 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.053524 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-df7774cfb-l6b7d"] Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082612 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082680 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082702 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle\") pod 
\"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082728 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082751 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082770 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082840 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082870 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: 
\"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082934 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082969 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.082996 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.083035 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.083064 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.083092 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdn2\" (UniqueName: \"kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2\") pod \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\" (UID: \"7aab6527-d135-45a0-8fe0-99de1fd40d3d\") " Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.085138 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.085489 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.085581 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.085630 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.086389 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.090666 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.090819 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2" (OuterVolumeSpecName: "kube-api-access-szdn2") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "kube-api-access-szdn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.090958 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.091193 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.091999 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.092663 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.092907 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.094306 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.099983 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7aab6527-d135-45a0-8fe0-99de1fd40d3d" (UID: "7aab6527-d135-45a0-8fe0-99de1fd40d3d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184397 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184464 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184488 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-policies\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184572 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184597 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184622 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184646 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184680 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-error\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184752 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjc4k\" 
(UniqueName: \"kubernetes.io/projected/f6f6115d-7764-484e-840a-c2fcc28dfe3c-kube-api-access-wjc4k\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184868 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-session\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184895 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184934 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-dir\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184962 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " 
pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.184983 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-login\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185070 4724 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185085 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185099 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185112 4724 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7aab6527-d135-45a0-8fe0-99de1fd40d3d-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185126 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185138 
4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185150 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185164 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szdn2\" (UniqueName: \"kubernetes.io/projected/7aab6527-d135-45a0-8fe0-99de1fd40d3d-kube-api-access-szdn2\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185177 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185189 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185202 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185219 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185232 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.185246 4724 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7aab6527-d135-45a0-8fe0-99de1fd40d3d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286238 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286349 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-session\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286408 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-dir\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " 
pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286443 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286465 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-login\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286489 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286517 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-policies\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286569 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286654 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286669 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-dir\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286691 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286843 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " 
pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286900 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.286974 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-error\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.287012 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjc4k\" (UniqueName: \"kubernetes.io/projected/f6f6115d-7764-484e-840a-c2fcc28dfe3c-kube-api-access-wjc4k\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.288058 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.288061 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.289197 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.289331 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6f6115d-7764-484e-840a-c2fcc28dfe3c-audit-policies\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.290248 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.290821 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " 
pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.291181 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.292451 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.294230 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.295717 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-error\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.296098 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-user-template-login\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.296744 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6f6115d-7764-484e-840a-c2fcc28dfe3c-v4-0-config-system-session\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.306501 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjc4k\" (UniqueName: \"kubernetes.io/projected/f6f6115d-7764-484e-840a-c2fcc28dfe3c-kube-api-access-wjc4k\") pod \"oauth-openshift-df7774cfb-l6b7d\" (UID: \"f6f6115d-7764-484e-840a-c2fcc28dfe3c\") " pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.374436 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.514108 4724 generic.go:334] "Generic (PLEG): container finished" podID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" containerID="d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4" exitCode=0 Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.514173 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.514164 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" event={"ID":"7aab6527-d135-45a0-8fe0-99de1fd40d3d","Type":"ContainerDied","Data":"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4"} Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.514524 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m8t6v" event={"ID":"7aab6527-d135-45a0-8fe0-99de1fd40d3d","Type":"ContainerDied","Data":"155b2ba8d9ceb546e3f310fa236b3111ccb53c91a4470cb248147158f27b9aef"} Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.514564 4724 scope.go:117] "RemoveContainer" containerID="d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.536272 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"] Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.540372 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m8t6v"] Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.549748 4724 scope.go:117] "RemoveContainer" containerID="d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4" Oct 02 13:03:16 crc kubenswrapper[4724]: E1002 13:03:16.550111 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4\": container with ID starting with d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4 not found: ID does not exist" containerID="d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.550141 
4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4"} err="failed to get container status \"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4\": rpc error: code = NotFound desc = could not find container \"d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4\": container with ID starting with d1197afbf4f6871bdff3016acbbd4ddbf867dd3e3e9231c31dc5b8888cc765a4 not found: ID does not exist" Oct 02 13:03:16 crc kubenswrapper[4724]: I1002 13:03:16.814681 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-df7774cfb-l6b7d"] Oct 02 13:03:16 crc kubenswrapper[4724]: W1002 13:03:16.830756 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f6115d_7764_484e_840a_c2fcc28dfe3c.slice/crio-d8d421251448b55e1a8ee369f1090c07c3865ee97354047655f43218a6dcaa51 WatchSource:0}: Error finding container d8d421251448b55e1a8ee369f1090c07c3865ee97354047655f43218a6dcaa51: Status 404 returned error can't find the container with id d8d421251448b55e1a8ee369f1090c07c3865ee97354047655f43218a6dcaa51 Oct 02 13:03:17 crc kubenswrapper[4724]: I1002 13:03:17.523164 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" event={"ID":"f6f6115d-7764-484e-840a-c2fcc28dfe3c","Type":"ContainerStarted","Data":"d8135dcf7028a89a546fc3dcc238786fa65867bdf23a4dc0b4234464743f2276"} Oct 02 13:03:17 crc kubenswrapper[4724]: I1002 13:03:17.523563 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" event={"ID":"f6f6115d-7764-484e-840a-c2fcc28dfe3c","Type":"ContainerStarted","Data":"d8d421251448b55e1a8ee369f1090c07c3865ee97354047655f43218a6dcaa51"} Oct 02 13:03:17 crc kubenswrapper[4724]: I1002 13:03:17.524019 4724 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:17 crc kubenswrapper[4724]: I1002 13:03:17.549943 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" podStartSLOduration=27.549917627 podStartE2EDuration="27.549917627s" podCreationTimestamp="2025-10-02 13:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:03:17.543252858 +0000 UTC m=+261.998011999" watchObservedRunningTime="2025-10-02 13:03:17.549917627 +0000 UTC m=+262.004676758" Oct 02 13:03:17 crc kubenswrapper[4724]: I1002 13:03:17.730287 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-df7774cfb-l6b7d" Oct 02 13:03:18 crc kubenswrapper[4724]: I1002 13:03:18.319549 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aab6527-d135-45a0-8fe0-99de1fd40d3d" path="/var/lib/kubelet/pods/7aab6527-d135-45a0-8fe0-99de1fd40d3d/volumes" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.272845 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.273855 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqs8m" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="registry-server" containerID="cri-o://b27dc3cdb4ccad2c21ab78ceb037953bcfe9567e41c01e14e1d80d387123ee01" gracePeriod=30 Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.298983 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.299428 4724 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/community-operators-x7mvk" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="registry-server" containerID="cri-o://451e7e9fdef3df62f41e5fee8b3cb9fe314f91b1b4872d3f76eb90260d7d5241" gracePeriod=30 Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.355487 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.356626 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.356934 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.356910 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" containerID="cri-o://e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561" gracePeriod=30 Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.356972 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rp6lr"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.357209 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kpsxg" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="registry-server" containerID="cri-o://4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940" gracePeriod=30 Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.357328 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gffjg" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="registry-server" 
containerID="cri-o://47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f" gracePeriod=30 Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.358034 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rp6lr"] Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.358100 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.432253 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.432389 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4zb4\" (UniqueName: \"kubernetes.io/projected/fc56248d-2826-4811-97eb-86c6ffa04f61-kube-api-access-c4zb4\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.432428 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.533525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.533622 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4zb4\" (UniqueName: \"kubernetes.io/projected/fc56248d-2826-4811-97eb-86c6ffa04f61-kube-api-access-c4zb4\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.533654 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.534813 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.540755 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc56248d-2826-4811-97eb-86c6ffa04f61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.554981 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4zb4\" (UniqueName: \"kubernetes.io/projected/fc56248d-2826-4811-97eb-86c6ffa04f61-kube-api-access-c4zb4\") pod \"marketplace-operator-79b997595-rp6lr\" (UID: \"fc56248d-2826-4811-97eb-86c6ffa04f61\") " pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:36 crc kubenswrapper[4724]: I1002 13:03:36.677247 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.262386 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.297762 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rp6lr"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.344007 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca\") pod \"98449ccf-cf29-44ab-9400-994b04309bb5\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.344088 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics\") pod \"98449ccf-cf29-44ab-9400-994b04309bb5\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.344138 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5dlhl\" (UniqueName: \"kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl\") pod \"98449ccf-cf29-44ab-9400-994b04309bb5\" (UID: \"98449ccf-cf29-44ab-9400-994b04309bb5\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.348091 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "98449ccf-cf29-44ab-9400-994b04309bb5" (UID: "98449ccf-cf29-44ab-9400-994b04309bb5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.358471 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl" (OuterVolumeSpecName: "kube-api-access-5dlhl") pod "98449ccf-cf29-44ab-9400-994b04309bb5" (UID: "98449ccf-cf29-44ab-9400-994b04309bb5"). InnerVolumeSpecName "kube-api-access-5dlhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.362953 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "98449ccf-cf29-44ab-9400-994b04309bb5" (UID: "98449ccf-cf29-44ab-9400-994b04309bb5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.448699 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dlhl\" (UniqueName: \"kubernetes.io/projected/98449ccf-cf29-44ab-9400-994b04309bb5-kube-api-access-5dlhl\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.448725 4724 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.448734 4724 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98449ccf-cf29-44ab-9400-994b04309bb5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.459802 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.549682 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities\") pod \"b3d38739-e797-4866-aebe-290d90535c73\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.549777 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxvjs\" (UniqueName: \"kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs\") pod \"b3d38739-e797-4866-aebe-290d90535c73\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.549807 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content\") pod \"b3d38739-e797-4866-aebe-290d90535c73\" (UID: \"b3d38739-e797-4866-aebe-290d90535c73\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.550679 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities" (OuterVolumeSpecName: "utilities") pod "b3d38739-e797-4866-aebe-290d90535c73" (UID: "b3d38739-e797-4866-aebe-290d90535c73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.553601 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs" (OuterVolumeSpecName: "kube-api-access-cxvjs") pod "b3d38739-e797-4866-aebe-290d90535c73" (UID: "b3d38739-e797-4866-aebe-290d90535c73"). InnerVolumeSpecName "kube-api-access-cxvjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.594501 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.631869 4724 generic.go:334] "Generic (PLEG): container finished" podID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerID="b27dc3cdb4ccad2c21ab78ceb037953bcfe9567e41c01e14e1d80d387123ee01" exitCode=0 Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.631941 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerDied","Data":"b27dc3cdb4ccad2c21ab78ceb037953bcfe9567e41c01e14e1d80d387123ee01"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.634231 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" event={"ID":"fc56248d-2826-4811-97eb-86c6ffa04f61","Type":"ContainerStarted","Data":"4028892ee5ce965e02a85e26ddf1694f8d5c809ec40ea7e88be98fa0d57f3634"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.636561 4724 generic.go:334] "Generic (PLEG): container finished" podID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerID="47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f" exitCode=0 Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.636637 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerDied","Data":"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.636637 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gffjg" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.636675 4724 scope.go:117] "RemoveContainer" containerID="47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.636664 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gffjg" event={"ID":"382d6bf1-1008-4b35-a7d2-fee3a1df7191","Type":"ContainerDied","Data":"237d227c42ea57dd21478ffc56dd9d7c02f9b240c797af1741daf03b752bb0c9"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.639840 4724 generic.go:334] "Generic (PLEG): container finished" podID="98449ccf-cf29-44ab-9400-994b04309bb5" containerID="e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561" exitCode=0 Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.639902 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" event={"ID":"98449ccf-cf29-44ab-9400-994b04309bb5","Type":"ContainerDied","Data":"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.639923 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" event={"ID":"98449ccf-cf29-44ab-9400-994b04309bb5","Type":"ContainerDied","Data":"f0795d374d61fbbc89327ed845ee7cf6f1596a6f1b00cb5577bfb793908baa52"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.639938 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j7cp6" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.642883 4724 generic.go:334] "Generic (PLEG): container finished" podID="b3d38739-e797-4866-aebe-290d90535c73" containerID="4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940" exitCode=0 Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.642950 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerDied","Data":"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.642977 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpsxg" event={"ID":"b3d38739-e797-4866-aebe-290d90535c73","Type":"ContainerDied","Data":"af41568fa000c117680a8c24b8be765baac88ed70e3cd427dc74304a1a794402"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.643901 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpsxg" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.647454 4724 generic.go:334] "Generic (PLEG): container finished" podID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerID="451e7e9fdef3df62f41e5fee8b3cb9fe314f91b1b4872d3f76eb90260d7d5241" exitCode=0 Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.647491 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerDied","Data":"451e7e9fdef3df62f41e5fee8b3cb9fe314f91b1b4872d3f76eb90260d7d5241"} Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.651082 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.651121 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxvjs\" (UniqueName: \"kubernetes.io/projected/b3d38739-e797-4866-aebe-290d90535c73-kube-api-access-cxvjs\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.654274 4724 scope.go:117] "RemoveContainer" containerID="d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.672069 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d38739-e797-4866-aebe-290d90535c73" (UID: "b3d38739-e797-4866-aebe-290d90535c73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.672195 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.674191 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j7cp6"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.677733 4724 scope.go:117] "RemoveContainer" containerID="8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.718364 4724 scope.go:117] "RemoveContainer" containerID="47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.719833 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f\": container with ID starting with 47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f not found: ID does not exist" containerID="47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.719901 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f"} err="failed to get container status \"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f\": rpc error: code = NotFound desc = could not find container \"47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f\": container with ID starting with 47d89ece7ae6d263a6533c50d8914c6ac85980c7b9cade19c90760330f95b71f not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.719932 4724 scope.go:117] "RemoveContainer" 
containerID="d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.721490 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79\": container with ID starting with d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79 not found: ID does not exist" containerID="d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.721514 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79"} err="failed to get container status \"d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79\": rpc error: code = NotFound desc = could not find container \"d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79\": container with ID starting with d598cbc6ca817552f224d0676636ea861ee347051c9832d6ec5efc10bb9f6c79 not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.721528 4724 scope.go:117] "RemoveContainer" containerID="8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.722231 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50\": container with ID starting with 8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50 not found: ID does not exist" containerID="8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.722256 4724 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50"} err="failed to get container status \"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50\": rpc error: code = NotFound desc = could not find container \"8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50\": container with ID starting with 8673fdf725b29f43e6bcdaa68e8a5f2a24c2b78417df21a5234352edd816fc50 not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.722270 4724 scope.go:117] "RemoveContainer" containerID="e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.736750 4724 scope.go:117] "RemoveContainer" containerID="e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.738298 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561\": container with ID starting with e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561 not found: ID does not exist" containerID="e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.738334 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561"} err="failed to get container status \"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561\": rpc error: code = NotFound desc = could not find container \"e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561\": container with ID starting with e77903f107437bc26aab15e6c9aef487702655bc215cf88d882a0c9b4ad8d561 not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.738354 4724 scope.go:117] "RemoveContainer" 
containerID="4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.740254 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.751806 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content\") pod \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.751862 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzpn\" (UniqueName: \"kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn\") pod \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.752005 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities\") pod \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\" (UID: \"382d6bf1-1008-4b35-a7d2-fee3a1df7191\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.752431 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d38739-e797-4866-aebe-290d90535c73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.753262 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities" (OuterVolumeSpecName: "utilities") pod "382d6bf1-1008-4b35-a7d2-fee3a1df7191" (UID: "382d6bf1-1008-4b35-a7d2-fee3a1df7191"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.760269 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn" (OuterVolumeSpecName: "kube-api-access-spzpn") pod "382d6bf1-1008-4b35-a7d2-fee3a1df7191" (UID: "382d6bf1-1008-4b35-a7d2-fee3a1df7191"). InnerVolumeSpecName "kube-api-access-spzpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.768178 4724 scope.go:117] "RemoveContainer" containerID="a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.775000 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "382d6bf1-1008-4b35-a7d2-fee3a1df7191" (UID: "382d6bf1-1008-4b35-a7d2-fee3a1df7191"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.790348 4724 scope.go:117] "RemoveContainer" containerID="37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.792137 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.810912 4724 scope.go:117] "RemoveContainer" containerID="4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.812243 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940\": container with ID starting with 4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940 not found: ID does not exist" containerID="4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.812433 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940"} err="failed to get container status \"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940\": rpc error: code = NotFound desc = could not find container \"4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940\": container with ID starting with 4496ef58a8798a432ef60da6b4fbd1e001ccd3736be9f4bf4b39819b54a99940 not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.812458 4724 scope.go:117] "RemoveContainer" containerID="a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.814732 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab\": container with ID starting with a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab not found: ID does not exist" containerID="a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab" Oct 02 13:03:37 crc 
kubenswrapper[4724]: I1002 13:03:37.814762 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab"} err="failed to get container status \"a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab\": rpc error: code = NotFound desc = could not find container \"a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab\": container with ID starting with a8cbf453cb0e997f36585f1b90a2df739aaed41783c30af23758218d58accbab not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.814777 4724 scope.go:117] "RemoveContainer" containerID="37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668" Oct 02 13:03:37 crc kubenswrapper[4724]: E1002 13:03:37.821802 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668\": container with ID starting with 37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668 not found: ID does not exist" containerID="37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.821846 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668"} err="failed to get container status \"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668\": rpc error: code = NotFound desc = could not find container \"37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668\": container with ID starting with 37fed1d5a662fe6c9c70f4d8be320e774dcb243fcd96aca6c1797d9e7b2d6668 not found: ID does not exist" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853081 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities\") pod \"d1808303-74a2-424b-9dd4-64838d28a1c7\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853166 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whnfx\" (UniqueName: \"kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx\") pod \"d1808303-74a2-424b-9dd4-64838d28a1c7\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853195 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content\") pod \"d1808303-74a2-424b-9dd4-64838d28a1c7\" (UID: \"d1808303-74a2-424b-9dd4-64838d28a1c7\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853484 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853503 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzpn\" (UniqueName: \"kubernetes.io/projected/382d6bf1-1008-4b35-a7d2-fee3a1df7191-kube-api-access-spzpn\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853517 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382d6bf1-1008-4b35-a7d2-fee3a1df7191-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.853911 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities" (OuterVolumeSpecName: "utilities") pod "d1808303-74a2-424b-9dd4-64838d28a1c7" 
(UID: "d1808303-74a2-424b-9dd4-64838d28a1c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.855872 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx" (OuterVolumeSpecName: "kube-api-access-whnfx") pod "d1808303-74a2-424b-9dd4-64838d28a1c7" (UID: "d1808303-74a2-424b-9dd4-64838d28a1c7"). InnerVolumeSpecName "kube-api-access-whnfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.900230 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1808303-74a2-424b-9dd4-64838d28a1c7" (UID: "d1808303-74a2-424b-9dd4-64838d28a1c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.954794 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn79v\" (UniqueName: \"kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v\") pod \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.954854 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities\") pod \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.954888 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content\") pod \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\" (UID: \"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d\") " Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.955141 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.955162 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whnfx\" (UniqueName: \"kubernetes.io/projected/d1808303-74a2-424b-9dd4-64838d28a1c7-kube-api-access-whnfx\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.955173 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1808303-74a2-424b-9dd4-64838d28a1c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.955664 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities" (OuterVolumeSpecName: "utilities") pod "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" (UID: "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.959293 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v" (OuterVolumeSpecName: "kube-api-access-bn79v") pod "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" (UID: "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d"). InnerVolumeSpecName "kube-api-access-bn79v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.970703 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.974198 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gffjg"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.981423 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:03:37 crc kubenswrapper[4724]: I1002 13:03:37.986579 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kpsxg"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.010999 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" (UID: "53ecbc18-3d93-4ee9-bd02-e3e99db2a82d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.055978 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn79v\" (UniqueName: \"kubernetes.io/projected/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-kube-api-access-bn79v\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.056047 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.056065 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.321525 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" path="/var/lib/kubelet/pods/382d6bf1-1008-4b35-a7d2-fee3a1df7191/volumes" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.323255 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" path="/var/lib/kubelet/pods/98449ccf-cf29-44ab-9400-994b04309bb5/volumes" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.323915 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d38739-e797-4866-aebe-290d90535c73" path="/var/lib/kubelet/pods/b3d38739-e797-4866-aebe-290d90535c73/volumes" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466379 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t7kvp"] Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466611 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="registry-server" Oct 02 
13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466627 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466641 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466649 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466663 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466671 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466680 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466687 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466701 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466709 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466731 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="registry-server" Oct 02 
13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466738 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466745 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466753 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466764 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466771 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466782 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466789 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466801 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466808 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" containerName="marketplace-operator" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466818 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="extract-content" Oct 02 
13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466825 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466835 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466841 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="extract-utilities" Oct 02 13:03:38 crc kubenswrapper[4724]: E1002 13:03:38.466846 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466853 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="extract-content" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466930 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="382d6bf1-1008-4b35-a7d2-fee3a1df7191" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466942 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466950 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d38739-e797-4866-aebe-290d90535c73" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466960 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" containerName="registry-server" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.466967 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="98449ccf-cf29-44ab-9400-994b04309bb5" 
containerName="marketplace-operator" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.467615 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.469305 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.475347 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7kvp"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.561280 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-catalog-content\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.561352 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-utilities\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.561415 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx969\" (UniqueName: \"kubernetes.io/projected/916861ea-15de-4a65-b053-4474fb141748-kube-api-access-qx969\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.657161 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-x7mvk" event={"ID":"d1808303-74a2-424b-9dd4-64838d28a1c7","Type":"ContainerDied","Data":"6cbe463c41af50340c677cd9ac3c3e8de0bb2f42ce683760e82c9192fe790af2"} Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.657243 4724 scope.go:117] "RemoveContainer" containerID="451e7e9fdef3df62f41e5fee8b3cb9fe314f91b1b4872d3f76eb90260d7d5241" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.657180 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7mvk" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.663446 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-catalog-content\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.663669 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-utilities\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.663931 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx969\" (UniqueName: \"kubernetes.io/projected/916861ea-15de-4a65-b053-4474fb141748-kube-api-access-qx969\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.665151 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqs8m" 
event={"ID":"53ecbc18-3d93-4ee9-bd02-e3e99db2a82d","Type":"ContainerDied","Data":"47dfa156b3695a99d262407baf1ed396e38a4bc4571440136a6c353b227f4692"} Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.665282 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqs8m" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.665607 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-utilities\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.665908 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916861ea-15de-4a65-b053-4474fb141748-catalog-content\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.680724 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.681746 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.682476 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" event={"ID":"fc56248d-2826-4811-97eb-86c6ffa04f61","Type":"ContainerStarted","Data":"8cdb0ea4a9e85bea201581f4544aa3c56826dedc3e8851a89b6d3c49710f826c"} Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.683748 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.686450 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.686773 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.690483 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.692759 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx969\" (UniqueName: \"kubernetes.io/projected/916861ea-15de-4a65-b053-4474fb141748-kube-api-access-qx969\") pod \"redhat-marketplace-t7kvp\" (UID: \"916861ea-15de-4a65-b053-4474fb141748\") " pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.692997 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.699302 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x7mvk"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.708362 4724 scope.go:117] 
"RemoveContainer" containerID="c9c1b438e8cf8ec66ad0f7b7e17df718f9ca88abd6cd9615a3d8f747118aa1cf" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.714271 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rp6lr" podStartSLOduration=2.714253479 podStartE2EDuration="2.714253479s" podCreationTimestamp="2025-10-02 13:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:03:38.713306305 +0000 UTC m=+283.168065426" watchObservedRunningTime="2025-10-02 13:03:38.714253479 +0000 UTC m=+283.169012600" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.736990 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.738998 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqs8m"] Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.739229 4724 scope.go:117] "RemoveContainer" containerID="5c76ea0a4df2f657bdf82c01366fa9a0e66416917f2b5e476c5e5b9aeda8d94e" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.766126 4724 scope.go:117] "RemoveContainer" containerID="b27dc3cdb4ccad2c21ab78ceb037953bcfe9567e41c01e14e1d80d387123ee01" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.766640 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.766740 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zs2r\" (UniqueName: 
\"kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.767071 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.791262 4724 scope.go:117] "RemoveContainer" containerID="efb3cf44d6d4b54e217263186c9e9c56adaa91cd883d6257d3b9e3b7961a6dad" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.810896 4724 scope.go:117] "RemoveContainer" containerID="29a766e792b317ce1f19e9c92d3680cd801b8395ee30ccb12ef96e3bbf18f6b7" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.811343 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.869959 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.870065 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zs2r\" (UniqueName: \"kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.870219 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.870600 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.871121 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " 
pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:38 crc kubenswrapper[4724]: I1002 13:03:38.886334 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zs2r\" (UniqueName: \"kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r\") pod \"redhat-operators-qvzc9\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.050808 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.199504 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7kvp"] Oct 02 13:03:39 crc kubenswrapper[4724]: W1002 13:03:39.211106 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod916861ea_15de_4a65_b053_4474fb141748.slice/crio-ec23c4a7a72d8c03463b12d174ec87970af5640007f64116707ccd8bf87230d3 WatchSource:0}: Error finding container ec23c4a7a72d8c03463b12d174ec87970af5640007f64116707ccd8bf87230d3: Status 404 returned error can't find the container with id ec23c4a7a72d8c03463b12d174ec87970af5640007f64116707ccd8bf87230d3 Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.426884 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.699016 4724 generic.go:334] "Generic (PLEG): container finished" podID="916861ea-15de-4a65-b053-4474fb141748" containerID="a52b2bc22eb174b0450145291be186bcde808328f373b4609fba8c0c39ae63eb" exitCode=0 Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.699188 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7kvp" 
event={"ID":"916861ea-15de-4a65-b053-4474fb141748","Type":"ContainerDied","Data":"a52b2bc22eb174b0450145291be186bcde808328f373b4609fba8c0c39ae63eb"} Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.699346 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7kvp" event={"ID":"916861ea-15de-4a65-b053-4474fb141748","Type":"ContainerStarted","Data":"ec23c4a7a72d8c03463b12d174ec87970af5640007f64116707ccd8bf87230d3"} Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.704908 4724 generic.go:334] "Generic (PLEG): container finished" podID="8ce22671-98d5-4e0e-9851-7da087e63499" containerID="8738163ca81747b4acc611e282a4daa506428591d0094f1fee997e649997f0ab" exitCode=0 Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.704998 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerDied","Data":"8738163ca81747b4acc611e282a4daa506428591d0094f1fee997e649997f0ab"} Oct 02 13:03:39 crc kubenswrapper[4724]: I1002 13:03:39.705025 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerStarted","Data":"e1bed5694c3423aac21b2bb5781b4804f24d22f170b53ba271787bd10dcaddd5"} Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.323218 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ecbc18-3d93-4ee9-bd02-e3e99db2a82d" path="/var/lib/kubelet/pods/53ecbc18-3d93-4ee9-bd02-e3e99db2a82d/volumes" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.324180 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1808303-74a2-424b-9dd4-64838d28a1c7" path="/var/lib/kubelet/pods/d1808303-74a2-424b-9dd4-64838d28a1c7/volumes" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.713237 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="916861ea-15de-4a65-b053-4474fb141748" containerID="d3f396520472d9a09c9a88dfe3d0b301f2ba833f52affe6c5442a96ef35003d7" exitCode=0 Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.713299 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7kvp" event={"ID":"916861ea-15de-4a65-b053-4474fb141748","Type":"ContainerDied","Data":"d3f396520472d9a09c9a88dfe3d0b301f2ba833f52affe6c5442a96ef35003d7"} Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.867772 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fmg79"] Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.869038 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.871195 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.878374 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmg79"] Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.997636 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-catalog-content\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.997700 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgt6\" (UniqueName: \"kubernetes.io/projected/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-kube-api-access-vfgt6\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " 
pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:40 crc kubenswrapper[4724]: I1002 13:03:40.997724 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-utilities\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.066997 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kz8p5"] Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.068027 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.070250 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.078451 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8p5"] Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.099364 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfgt6\" (UniqueName: \"kubernetes.io/projected/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-kube-api-access-vfgt6\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.099420 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-utilities\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc 
kubenswrapper[4724]: I1002 13:03:41.099482 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-catalog-content\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.100017 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-catalog-content\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.100017 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-utilities\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.124428 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgt6\" (UniqueName: \"kubernetes.io/projected/5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469-kube-api-access-vfgt6\") pod \"certified-operators-fmg79\" (UID: \"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469\") " pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.192215 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.201169 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp4th\" (UniqueName: \"kubernetes.io/projected/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-kube-api-access-jp4th\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.201227 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-catalog-content\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.201273 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-utilities\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.302152 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp4th\" (UniqueName: \"kubernetes.io/projected/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-kube-api-access-jp4th\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.302475 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-catalog-content\") pod 
\"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.302515 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-utilities\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.303026 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-utilities\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.303184 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-catalog-content\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.325526 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp4th\" (UniqueName: \"kubernetes.io/projected/bf1c3eef-48b3-4563-99fa-0a33d1c6835a-kube-api-access-jp4th\") pod \"community-operators-kz8p5\" (UID: \"bf1c3eef-48b3-4563-99fa-0a33d1c6835a\") " pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.382054 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.643266 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmg79"] Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.688660 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8p5"] Oct 02 13:03:41 crc kubenswrapper[4724]: W1002 13:03:41.700167 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1c3eef_48b3_4563_99fa_0a33d1c6835a.slice/crio-f0b77103e578efb53f6485b44a3bd1e2376140d5ea2ed73e92ce8d486f17bb65 WatchSource:0}: Error finding container f0b77103e578efb53f6485b44a3bd1e2376140d5ea2ed73e92ce8d486f17bb65: Status 404 returned error can't find the container with id f0b77103e578efb53f6485b44a3bd1e2376140d5ea2ed73e92ce8d486f17bb65 Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.724521 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmg79" event={"ID":"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469","Type":"ContainerStarted","Data":"e3578ede78685f57e035eaf6942c991e9ecfa805f703abaeb135580e9f4dbd6f"} Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.727014 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7kvp" event={"ID":"916861ea-15de-4a65-b053-4474fb141748","Type":"ContainerStarted","Data":"534c217f8e2dc6d0b1e308172ee901952f83ea1cbe0d5cea83b9a13dbdf082a0"} Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.728988 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8p5" event={"ID":"bf1c3eef-48b3-4563-99fa-0a33d1c6835a","Type":"ContainerStarted","Data":"f0b77103e578efb53f6485b44a3bd1e2376140d5ea2ed73e92ce8d486f17bb65"} Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.731004 
4724 generic.go:334] "Generic (PLEG): container finished" podID="8ce22671-98d5-4e0e-9851-7da087e63499" containerID="d2e75e274098d8b2aa30ca240780d309e6b590ae6a371187a68004c1ccac5305" exitCode=0 Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.731031 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerDied","Data":"d2e75e274098d8b2aa30ca240780d309e6b590ae6a371187a68004c1ccac5305"} Oct 02 13:03:41 crc kubenswrapper[4724]: I1002 13:03:41.749180 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t7kvp" podStartSLOduration=1.913624108 podStartE2EDuration="3.749161413s" podCreationTimestamp="2025-10-02 13:03:38 +0000 UTC" firstStartedPulling="2025-10-02 13:03:39.700862404 +0000 UTC m=+284.155621525" lastFinishedPulling="2025-10-02 13:03:41.536399709 +0000 UTC m=+285.991158830" observedRunningTime="2025-10-02 13:03:41.746876415 +0000 UTC m=+286.201635536" watchObservedRunningTime="2025-10-02 13:03:41.749161413 +0000 UTC m=+286.203920534" Oct 02 13:03:42 crc kubenswrapper[4724]: I1002 13:03:42.737873 4724 generic.go:334] "Generic (PLEG): container finished" podID="5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469" containerID="bdca43f48fa849e96c6c921760296ded4495b8b0a7e3b92296f8752925bb39be" exitCode=0 Oct 02 13:03:42 crc kubenswrapper[4724]: I1002 13:03:42.738055 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmg79" event={"ID":"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469","Type":"ContainerDied","Data":"bdca43f48fa849e96c6c921760296ded4495b8b0a7e3b92296f8752925bb39be"} Oct 02 13:03:42 crc kubenswrapper[4724]: I1002 13:03:42.743070 4724 generic.go:334] "Generic (PLEG): container finished" podID="bf1c3eef-48b3-4563-99fa-0a33d1c6835a" containerID="74ded895f8e1a4813fddc62d27b9cd01d0866a772a44d6daf0e7bf8b10de6e04" exitCode=0 Oct 02 13:03:42 crc 
kubenswrapper[4724]: I1002 13:03:42.743145 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8p5" event={"ID":"bf1c3eef-48b3-4563-99fa-0a33d1c6835a","Type":"ContainerDied","Data":"74ded895f8e1a4813fddc62d27b9cd01d0866a772a44d6daf0e7bf8b10de6e04"} Oct 02 13:03:42 crc kubenswrapper[4724]: I1002 13:03:42.748098 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerStarted","Data":"d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6"} Oct 02 13:03:42 crc kubenswrapper[4724]: I1002 13:03:42.801845 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qvzc9" podStartSLOduration=2.302079622 podStartE2EDuration="4.801824008s" podCreationTimestamp="2025-10-02 13:03:38 +0000 UTC" firstStartedPulling="2025-10-02 13:03:39.706718133 +0000 UTC m=+284.161477254" lastFinishedPulling="2025-10-02 13:03:42.206462519 +0000 UTC m=+286.661221640" observedRunningTime="2025-10-02 13:03:42.801755666 +0000 UTC m=+287.256514807" watchObservedRunningTime="2025-10-02 13:03:42.801824008 +0000 UTC m=+287.256583129" Oct 02 13:03:43 crc kubenswrapper[4724]: I1002 13:03:43.767939 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8p5" event={"ID":"bf1c3eef-48b3-4563-99fa-0a33d1c6835a","Type":"ContainerStarted","Data":"225127651f05cc8f7fc6d3122dcbe23d9357af9f955ffff6f5841d383cd57dd3"} Oct 02 13:03:44 crc kubenswrapper[4724]: I1002 13:03:44.774466 4724 generic.go:334] "Generic (PLEG): container finished" podID="bf1c3eef-48b3-4563-99fa-0a33d1c6835a" containerID="225127651f05cc8f7fc6d3122dcbe23d9357af9f955ffff6f5841d383cd57dd3" exitCode=0 Oct 02 13:03:44 crc kubenswrapper[4724]: I1002 13:03:44.774512 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kz8p5" event={"ID":"bf1c3eef-48b3-4563-99fa-0a33d1c6835a","Type":"ContainerDied","Data":"225127651f05cc8f7fc6d3122dcbe23d9357af9f955ffff6f5841d383cd57dd3"} Oct 02 13:03:44 crc kubenswrapper[4724]: I1002 13:03:44.776575 4724 generic.go:334] "Generic (PLEG): container finished" podID="5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469" containerID="1665178aa477ce8662525052436efdef34c52b52e0b7c6c4db36a2be87f09b03" exitCode=0 Oct 02 13:03:44 crc kubenswrapper[4724]: I1002 13:03:44.776605 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmg79" event={"ID":"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469","Type":"ContainerDied","Data":"1665178aa477ce8662525052436efdef34c52b52e0b7c6c4db36a2be87f09b03"} Oct 02 13:03:45 crc kubenswrapper[4724]: I1002 13:03:45.783511 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmg79" event={"ID":"5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469","Type":"ContainerStarted","Data":"938a316dc26e16ca974092c21d4f50083c308fbc505ed27be1bdc41a8458f6e2"} Oct 02 13:03:45 crc kubenswrapper[4724]: I1002 13:03:45.785774 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8p5" event={"ID":"bf1c3eef-48b3-4563-99fa-0a33d1c6835a","Type":"ContainerStarted","Data":"1dbff77ac13c21a0a13cc82c05d2e52eb01c11c40bd88d59942f9f8429c5e0fa"} Oct 02 13:03:45 crc kubenswrapper[4724]: I1002 13:03:45.807075 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fmg79" podStartSLOduration=3.119114352 podStartE2EDuration="5.807054987s" podCreationTimestamp="2025-10-02 13:03:40 +0000 UTC" firstStartedPulling="2025-10-02 13:03:42.740146409 +0000 UTC m=+287.194905530" lastFinishedPulling="2025-10-02 13:03:45.428087044 +0000 UTC m=+289.882846165" observedRunningTime="2025-10-02 13:03:45.803939028 +0000 UTC m=+290.258698169" 
watchObservedRunningTime="2025-10-02 13:03:45.807054987 +0000 UTC m=+290.261814108" Oct 02 13:03:45 crc kubenswrapper[4724]: I1002 13:03:45.818141 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kz8p5" podStartSLOduration=2.325178315 podStartE2EDuration="4.818122198s" podCreationTimestamp="2025-10-02 13:03:41 +0000 UTC" firstStartedPulling="2025-10-02 13:03:42.744550041 +0000 UTC m=+287.199309162" lastFinishedPulling="2025-10-02 13:03:45.237493924 +0000 UTC m=+289.692253045" observedRunningTime="2025-10-02 13:03:45.81739958 +0000 UTC m=+290.272158721" watchObservedRunningTime="2025-10-02 13:03:45.818122198 +0000 UTC m=+290.272881319" Oct 02 13:03:48 crc kubenswrapper[4724]: I1002 13:03:48.811621 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:48 crc kubenswrapper[4724]: I1002 13:03:48.812118 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:48 crc kubenswrapper[4724]: I1002 13:03:48.855089 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:49 crc kubenswrapper[4724]: I1002 13:03:49.051897 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:49 crc kubenswrapper[4724]: I1002 13:03:49.051939 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:49 crc kubenswrapper[4724]: I1002 13:03:49.087858 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:49 crc kubenswrapper[4724]: I1002 13:03:49.843756 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-t7kvp" Oct 02 13:03:49 crc kubenswrapper[4724]: I1002 13:03:49.847397 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.192964 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.193218 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.230673 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.382517 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.382624 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.418831 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.857477 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fmg79" Oct 02 13:03:51 crc kubenswrapper[4724]: I1002 13:03:51.860575 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kz8p5" Oct 02 13:05:04 crc kubenswrapper[4724]: I1002 13:05:04.734767 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:05:04 crc kubenswrapper[4724]: I1002 13:05:04.735378 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:05:34 crc kubenswrapper[4724]: I1002 13:05:34.735169 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:05:34 crc kubenswrapper[4724]: I1002 13:05:34.735818 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:06:04 crc kubenswrapper[4724]: I1002 13:06:04.734846 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:06:04 crc kubenswrapper[4724]: I1002 13:06:04.735427 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 02 13:06:04 crc kubenswrapper[4724]: I1002 13:06:04.735476 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:06:04 crc kubenswrapper[4724]: I1002 13:06:04.736134 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:06:04 crc kubenswrapper[4724]: I1002 13:06:04.736192 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895" gracePeriod=600 Oct 02 13:06:05 crc kubenswrapper[4724]: I1002 13:06:05.509393 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895" exitCode=0 Oct 02 13:06:05 crc kubenswrapper[4724]: I1002 13:06:05.509501 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895"} Oct 02 13:06:05 crc kubenswrapper[4724]: I1002 13:06:05.509786 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57"} Oct 02 13:06:05 
crc kubenswrapper[4724]: I1002 13:06:05.509815 4724 scope.go:117] "RemoveContainer" containerID="4d5dbca156adf53ca2d3b237e15b6dd4387249e73dbf1a7fb3e7e39c53d1b548" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.024939 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6s9jv"] Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.026365 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.082154 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6s9jv"] Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.193805 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-tls\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.193893 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-trusted-ca\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.193949 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-bound-sa-token\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.194006 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.194046 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.194074 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxvf7\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-kube-api-access-fxvf7\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.194126 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-certificates\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.194151 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.215167 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.299497 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-bound-sa-token\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.299970 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.299999 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.300041 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxvf7\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-kube-api-access-fxvf7\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.300071 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-certificates\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.300111 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-tls\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.300135 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-trusted-ca\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.301426 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-trusted-ca\") pod \"image-registry-66df7c8f76-6s9jv\" 
(UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.303253 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.303760 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-certificates\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.312677 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-registry-tls\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.319237 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.330458 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-bound-sa-token\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.337503 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxvf7\" (UniqueName: \"kubernetes.io/projected/6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d-kube-api-access-fxvf7\") pod \"image-registry-66df7c8f76-6s9jv\" (UID: \"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.342059 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.532151 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6s9jv"] Oct 02 13:06:25 crc kubenswrapper[4724]: W1002 13:06:25.537771 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6491d7ae_8bd9_4aba_8ff0_1639f9dc8b4d.slice/crio-a4a93d4b91ad8585dd19e181a0d85ac08c0b17a876f4af8e85a8fb35195ae614 WatchSource:0}: Error finding container a4a93d4b91ad8585dd19e181a0d85ac08c0b17a876f4af8e85a8fb35195ae614: Status 404 returned error can't find the container with id a4a93d4b91ad8585dd19e181a0d85ac08c0b17a876f4af8e85a8fb35195ae614 Oct 02 13:06:25 crc kubenswrapper[4724]: I1002 13:06:25.631170 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" event={"ID":"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d","Type":"ContainerStarted","Data":"a4a93d4b91ad8585dd19e181a0d85ac08c0b17a876f4af8e85a8fb35195ae614"} Oct 02 13:06:26 crc kubenswrapper[4724]: I1002 13:06:26.638279 4724 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" event={"ID":"6491d7ae-8bd9-4aba-8ff0-1639f9dc8b4d","Type":"ContainerStarted","Data":"a5f66b294ce8a6148fc786c5690df38e8430fcf03f9899e26ba368f394192f3d"} Oct 02 13:06:26 crc kubenswrapper[4724]: I1002 13:06:26.638588 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:26 crc kubenswrapper[4724]: I1002 13:06:26.656708 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" podStartSLOduration=1.656678108 podStartE2EDuration="1.656678108s" podCreationTimestamp="2025-10-02 13:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:06:26.654182895 +0000 UTC m=+451.108942036" watchObservedRunningTime="2025-10-02 13:06:26.656678108 +0000 UTC m=+451.111437269" Oct 02 13:06:45 crc kubenswrapper[4724]: I1002 13:06:45.351295 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6s9jv" Oct 02 13:06:45 crc kubenswrapper[4724]: I1002 13:06:45.422669 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"] Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.466298 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" podUID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" containerName="registry" containerID="cri-o://85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c" gracePeriod=30 Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.824236 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.878095 4724 generic.go:334] "Generic (PLEG): container finished" podID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" containerID="85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c" exitCode=0 Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.878142 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" event={"ID":"7de6408f-76b4-4a9a-bf83-fe6c4e60848e","Type":"ContainerDied","Data":"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c"} Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.878174 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" event={"ID":"7de6408f-76b4-4a9a-bf83-fe6c4e60848e","Type":"ContainerDied","Data":"2642d328cc74da2067fa8fbdb4e7f4511fbf939d66829b3fc4488c951f2c0fd5"} Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.878201 4724 scope.go:117] "RemoveContainer" containerID="85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.878215 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5w5d" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.896503 4724 scope.go:117] "RemoveContainer" containerID="85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c" Oct 02 13:07:10 crc kubenswrapper[4724]: E1002 13:07:10.897395 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c\": container with ID starting with 85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c not found: ID does not exist" containerID="85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.897462 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c"} err="failed to get container status \"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c\": rpc error: code = NotFound desc = could not find container \"85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c\": container with ID starting with 85a33b0ddf97fe066a65853af2bda8fa37d284b8b13340b4f068a7fe1e48127c not found: ID does not exist" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965624 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965673 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: 
\"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965717 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965748 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965915 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965934 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2h7\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.965995 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.966036 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted\") pod \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\" (UID: \"7de6408f-76b4-4a9a-bf83-fe6c4e60848e\") " Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.966886 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.966904 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.973676 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.974085 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.974468 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7" (OuterVolumeSpecName: "kube-api-access-bm2h7") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "kube-api-access-bm2h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.979123 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.979689 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 13:07:10 crc kubenswrapper[4724]: I1002 13:07:10.990624 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7de6408f-76b4-4a9a-bf83-fe6c4e60848e" (UID: "7de6408f-76b4-4a9a-bf83-fe6c4e60848e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.066983 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm2h7\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-kube-api-access-bm2h7\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067228 4724 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067287 4724 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067374 4724 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067435 4724 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067487 4724 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.067548 4724 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7de6408f-76b4-4a9a-bf83-fe6c4e60848e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:11 crc 
kubenswrapper[4724]: I1002 13:07:11.214453 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"] Oct 02 13:07:11 crc kubenswrapper[4724]: I1002 13:07:11.220703 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5w5d"] Oct 02 13:07:12 crc kubenswrapper[4724]: I1002 13:07:12.322645 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" path="/var/lib/kubelet/pods/7de6408f-76b4-4a9a-bf83-fe6c4e60848e/volumes" Oct 02 13:08:34 crc kubenswrapper[4724]: I1002 13:08:34.735019 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:08:34 crc kubenswrapper[4724]: I1002 13:08:34.735707 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:09:04 crc kubenswrapper[4724]: I1002 13:09:04.735000 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:09:04 crc kubenswrapper[4724]: I1002 13:09:04.735986 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.466810 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w58lt"] Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467744 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-controller" containerID="cri-o://4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467871 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-node" containerID="cri-o://197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467960 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-acl-logging" containerID="cri-o://1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467951 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467971 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="sbdb" 
containerID="cri-o://3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.467974 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="nbdb" containerID="cri-o://300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.468372 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="northd" containerID="cri-o://6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.518274 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" containerID="cri-o://8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" gracePeriod=30 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.659921 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/2.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.660725 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/1.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.660779 4724 generic.go:334] "Generic (PLEG): container finished" podID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4" containerID="14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076" exitCode=2 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.660857 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerDied","Data":"14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.660910 4724 scope.go:117] "RemoveContainer" containerID="f6c546eb3faa335d37ee1c08a88b2d409a5ee23ed42ca9fd7104fb0bb76ecf15" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.661788 4724 scope.go:117] "RemoveContainer" containerID="14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.662208 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pr276_openshift-multus(c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4)\"" pod="openshift-multus/multus-pr276" podUID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.665391 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovnkube-controller/3.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.691069 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-acl-logging/0.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692158 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-controller/0.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692817 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" exitCode=0 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692849 4724 generic.go:334] "Generic 
(PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" exitCode=0 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692857 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" exitCode=0 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692864 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" exitCode=143 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692891 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" exitCode=143 Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692903 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692950 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692962 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692972 4724 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.692981 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742"} Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.778075 4724 scope.go:117] "RemoveContainer" containerID="6c6b3b7fc0c4fc12e9d202da41be4c0a934508c22e2f4d81029e5e29ac0073f1" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.806307 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-acl-logging/0.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.806757 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-controller/0.log" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.807137 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865010 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jvbkw"] Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865370 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865401 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865418 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kubecfg-setup" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865432 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kubecfg-setup" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865443 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865453 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865466 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="sbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865474 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="sbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865486 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" 
containerName="nbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865495 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="nbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865511 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-acl-logging" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865520 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-acl-logging" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865535 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-node" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865544 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-node" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865577 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865586 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865597 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" containerName="registry" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865606 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" containerName="registry" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865622 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" 
containerName="northd" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865631 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="northd" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865645 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865653 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.865666 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865675 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865797 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865810 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-acl-logging" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865823 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865834 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865849 4724 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865861 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de6408f-76b4-4a9a-bf83-fe6c4e60848e" containerName="registry" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865872 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="sbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865884 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="nbdb" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865897 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovn-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865909 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865918 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="kube-rbac-proxy-node" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.865929 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="northd" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.866070 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.866082 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: E1002 13:09:29.866098 4724 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.866111 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.866266 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" containerName="ovnkube-controller" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.868736 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960263 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960369 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxqx\" (UniqueName: \"kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960418 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960445 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960453 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960488 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960582 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960607 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960640 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960679 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960700 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket" (OuterVolumeSpecName: "log-socket") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960711 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960776 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960841 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960887 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960928 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.960970 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961002 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961033 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961070 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961102 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961141 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 
13:09:29.961209 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides\") pod \"4089ad23-969c-4222-a8ed-e141ec291e80\" (UID: \"4089ad23-969c-4222-a8ed-e141ec291e80\") " Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961488 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/252e3e73-b46f-414a-96f7-c264bcd63b32-ovn-node-metrics-cert\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961575 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-ovn\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961619 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-config\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961673 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-var-lib-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961774 4724 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.961831 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962320 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962372 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962400 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log" (OuterVolumeSpecName: "node-log") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962428 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash" (OuterVolumeSpecName: "host-slash") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962816 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962871 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.962903 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.963605 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.963642 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.963658 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.963690 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.963785 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-etc-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964055 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964209 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-systemd-units\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964258 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-log-socket\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964302 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-env-overrides\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964330 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964354 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-slash\") pod \"ovnkube-node-jvbkw\" (UID: 
\"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964376 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-kubelet\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964416 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-node-log\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964440 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9249\" (UniqueName: \"kubernetes.io/projected/252e3e73-b46f-414a-96f7-c264bcd63b32-kube-api-access-n9249\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964474 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-systemd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964500 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964530 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-netns\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964592 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-bin\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964618 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-script-lib\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964646 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-netd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964688 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964751 4724 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964768 4724 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964783 4724 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964798 4724 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964813 4724 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964827 4724 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964840 4724 reconciler_common.go:293] "Volume detached for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964853 4724 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964868 4724 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964881 4724 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964894 4724 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4089ad23-969c-4222-a8ed-e141ec291e80-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964907 4724 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964920 4724 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964933 4724 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964947 4724 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964960 4724 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.964973 4724 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.967685 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx" (OuterVolumeSpecName: "kube-api-access-7sxqx") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "kube-api-access-7sxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.968308 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:09:29 crc kubenswrapper[4724]: I1002 13:09:29.975688 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4089ad23-969c-4222-a8ed-e141ec291e80" (UID: "4089ad23-969c-4222-a8ed-e141ec291e80"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065670 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-bin\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065716 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-script-lib\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065733 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-netd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065749 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065778 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/252e3e73-b46f-414a-96f7-c264bcd63b32-ovn-node-metrics-cert\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065807 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-ovn\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065837 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-config\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065869 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-var-lib-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065885 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-etc-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc 
kubenswrapper[4724]: I1002 13:09:30.065901 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-systemd-units\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065920 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-log-socket\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065944 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-env-overrides\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065960 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065948 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-netd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066017 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-slash\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.065976 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-slash\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066059 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-ovn\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066105 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-kubelet\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066196 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-node-log\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066228 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9249\" (UniqueName: 
\"kubernetes.io/projected/252e3e73-b46f-414a-96f7-c264bcd63b32-kube-api-access-n9249\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066236 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-systemd-units\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066278 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066306 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-etc-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066325 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-systemd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066349 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-log-socket\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066371 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-netns\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066402 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-netns\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066223 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-cni-bin\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066436 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066282 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-openvswitch\") pod 
\"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.066284 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-var-lib-openvswitch\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067047 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-config\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067375 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-ovnkube-script-lib\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067490 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067523 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/252e3e73-b46f-414a-96f7-c264bcd63b32-env-overrides\") pod \"ovnkube-node-jvbkw\" (UID: 
\"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067579 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-run-systemd\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067609 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-host-kubelet\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067624 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/252e3e73-b46f-414a-96f7-c264bcd63b32-node-log\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067723 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxqx\" (UniqueName: \"kubernetes.io/projected/4089ad23-969c-4222-a8ed-e141ec291e80-kube-api-access-7sxqx\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067746 4724 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4089ad23-969c-4222-a8ed-e141ec291e80-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.067761 4724 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4089ad23-969c-4222-a8ed-e141ec291e80-run-systemd\") on node 
\"crc\" DevicePath \"\"" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.073062 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/252e3e73-b46f-414a-96f7-c264bcd63b32-ovn-node-metrics-cert\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.083463 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9249\" (UniqueName: \"kubernetes.io/projected/252e3e73-b46f-414a-96f7-c264bcd63b32-kube-api-access-n9249\") pod \"ovnkube-node-jvbkw\" (UID: \"252e3e73-b46f-414a-96f7-c264bcd63b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.188806 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.703427 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-acl-logging/0.log" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.704450 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w58lt_4089ad23-969c-4222-a8ed-e141ec291e80/ovn-controller/0.log" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705044 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" exitCode=0 Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705083 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" exitCode=0 Oct 02 13:09:30 crc 
kubenswrapper[4724]: I1002 13:09:30.705095 4724 generic.go:334] "Generic (PLEG): container finished" podID="4089ad23-969c-4222-a8ed-e141ec291e80" containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" exitCode=0 Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705112 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705160 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705173 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705183 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" event={"ID":"4089ad23-969c-4222-a8ed-e141ec291e80","Type":"ContainerDied","Data":"679ff53a498043077c9aa75da8fcbe9b55c82ff0489a5f61898dbbcfc239ed1f"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705199 4724 scope.go:117] "RemoveContainer" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.705194 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w58lt" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.706994 4724 generic.go:334] "Generic (PLEG): container finished" podID="252e3e73-b46f-414a-96f7-c264bcd63b32" containerID="30c20e4ee24193d3ad27ad365cb507de31df62fad761442451ac2b2f396d15fa" exitCode=0 Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.707017 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerDied","Data":"30c20e4ee24193d3ad27ad365cb507de31df62fad761442451ac2b2f396d15fa"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.707061 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"c7294b515e7186256d26c727b75f72d35135419201fce836411e3f834ad9cd16"} Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.709033 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/2.log" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.729710 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w58lt"] Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.730482 4724 scope.go:117] "RemoveContainer" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.731602 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w58lt"] Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.744451 4724 scope.go:117] "RemoveContainer" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.773446 4724 scope.go:117] "RemoveContainer" 
containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.787905 4724 scope.go:117] "RemoveContainer" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.812414 4724 scope.go:117] "RemoveContainer" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.826395 4724 scope.go:117] "RemoveContainer" containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.842142 4724 scope.go:117] "RemoveContainer" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.856650 4724 scope.go:117] "RemoveContainer" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.870812 4724 scope.go:117] "RemoveContainer" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.871167 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": container with ID starting with 8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb not found: ID does not exist" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.871232 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb"} err="failed to get container status \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": rpc error: code = NotFound desc = could not 
find container \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": container with ID starting with 8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.871256 4724 scope.go:117] "RemoveContainer" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.871519 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": container with ID starting with 3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f not found: ID does not exist" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.871586 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f"} err="failed to get container status \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": rpc error: code = NotFound desc = could not find container \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": container with ID starting with 3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.871615 4724 scope.go:117] "RemoveContainer" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.871966 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": container with ID starting with 300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64 not found: ID 
does not exist" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.871994 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64"} err="failed to get container status \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": rpc error: code = NotFound desc = could not find container \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": container with ID starting with 300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.872009 4724 scope.go:117] "RemoveContainer" containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.874164 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": container with ID starting with 6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83 not found: ID does not exist" containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.874201 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83"} err="failed to get container status \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": rpc error: code = NotFound desc = could not find container \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": container with ID starting with 6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.874223 4724 
scope.go:117] "RemoveContainer" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.874633 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": container with ID starting with b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c not found: ID does not exist" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.874687 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c"} err="failed to get container status \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": rpc error: code = NotFound desc = could not find container \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": container with ID starting with b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.874718 4724 scope.go:117] "RemoveContainer" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.874979 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": container with ID starting with 197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25 not found: ID does not exist" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875005 4724 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25"} err="failed to get container status \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": rpc error: code = NotFound desc = could not find container \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": container with ID starting with 197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875024 4724 scope.go:117] "RemoveContainer" containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.875222 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": container with ID starting with 1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955 not found: ID does not exist" containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875255 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955"} err="failed to get container status \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": rpc error: code = NotFound desc = could not find container \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": container with ID starting with 1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875278 4724 scope.go:117] "RemoveContainer" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.875485 4724 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": container with ID starting with 4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742 not found: ID does not exist" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875514 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742"} err="failed to get container status \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": rpc error: code = NotFound desc = could not find container \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": container with ID starting with 4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.875533 4724 scope.go:117] "RemoveContainer" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" Oct 02 13:09:30 crc kubenswrapper[4724]: E1002 13:09:30.876001 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": container with ID starting with 9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2 not found: ID does not exist" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876033 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2"} err="failed to get container status \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": rpc error: code = NotFound desc = could not find container 
\"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": container with ID starting with 9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876051 4724 scope.go:117] "RemoveContainer" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876262 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb"} err="failed to get container status \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": rpc error: code = NotFound desc = could not find container \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": container with ID starting with 8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876290 4724 scope.go:117] "RemoveContainer" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876564 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f"} err="failed to get container status \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": rpc error: code = NotFound desc = could not find container \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": container with ID starting with 3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876659 4724 scope.go:117] "RemoveContainer" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876951 4724 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64"} err="failed to get container status \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": rpc error: code = NotFound desc = could not find container \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": container with ID starting with 300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.876976 4724 scope.go:117] "RemoveContainer" containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877171 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83"} err="failed to get container status \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": rpc error: code = NotFound desc = could not find container \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": container with ID starting with 6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877199 4724 scope.go:117] "RemoveContainer" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877494 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c"} err="failed to get container status \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": rpc error: code = NotFound desc = could not find container \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": container with ID starting with 
b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877514 4724 scope.go:117] "RemoveContainer" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877728 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25"} err="failed to get container status \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": rpc error: code = NotFound desc = could not find container \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": container with ID starting with 197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877752 4724 scope.go:117] "RemoveContainer" containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.877979 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955"} err="failed to get container status \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": rpc error: code = NotFound desc = could not find container \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": container with ID starting with 1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.878022 4724 scope.go:117] "RemoveContainer" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.878346 4724 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742"} err="failed to get container status \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": rpc error: code = NotFound desc = could not find container \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": container with ID starting with 4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.878370 4724 scope.go:117] "RemoveContainer" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.878733 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2"} err="failed to get container status \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": rpc error: code = NotFound desc = could not find container \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": container with ID starting with 9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.878762 4724 scope.go:117] "RemoveContainer" containerID="8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.879147 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb"} err="failed to get container status \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": rpc error: code = NotFound desc = could not find container \"8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb\": container with ID starting with 8e820afb07a66f1eb4b72ab3a92ba9dbd09e42cf1a2994c9dbca241a450e51fb not found: ID does not 
exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.879178 4724 scope.go:117] "RemoveContainer" containerID="3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.879452 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f"} err="failed to get container status \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": rpc error: code = NotFound desc = could not find container \"3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f\": container with ID starting with 3c6cd68ea9e76b6dcb1649fafb808953b30389fcaee674b611c655c239a36e7f not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.879474 4724 scope.go:117] "RemoveContainer" containerID="300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.879999 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64"} err="failed to get container status \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": rpc error: code = NotFound desc = could not find container \"300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64\": container with ID starting with 300ea92580dcf51b685b378397b0dd067899d424f4394bce09a8242415acba64 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880017 4724 scope.go:117] "RemoveContainer" containerID="6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880248 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83"} err="failed to get container status 
\"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": rpc error: code = NotFound desc = could not find container \"6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83\": container with ID starting with 6bc33e00e91c1177fe2762cb5295c78f4b00c1861d44446269e2e7668b25ac83 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880265 4724 scope.go:117] "RemoveContainer" containerID="b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880466 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c"} err="failed to get container status \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": rpc error: code = NotFound desc = could not find container \"b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c\": container with ID starting with b3940d1762b165816101228f10689c4897ca3909b295c51368935e2d28d32e9c not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880484 4724 scope.go:117] "RemoveContainer" containerID="197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880732 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25"} err="failed to get container status \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": rpc error: code = NotFound desc = could not find container \"197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25\": container with ID starting with 197d101ac60dc4e22bd91f0bccaf0c5d9461d2bb94425b5c6d4ca75aad0f7e25 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.880760 4724 scope.go:117] "RemoveContainer" 
containerID="1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.881000 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955"} err="failed to get container status \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": rpc error: code = NotFound desc = could not find container \"1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955\": container with ID starting with 1f4412d5281a9c03ea5b251d17371e363b9c5a970bad0db12adf3f75ddd1f955 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.881021 4724 scope.go:117] "RemoveContainer" containerID="4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.881244 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742"} err="failed to get container status \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": rpc error: code = NotFound desc = could not find container \"4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742\": container with ID starting with 4186fbd2d282edbe581a4d8d3616dff86c3e6cdbe797999fa57f6034870e7742 not found: ID does not exist" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.881271 4724 scope.go:117] "RemoveContainer" containerID="9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2" Oct 02 13:09:30 crc kubenswrapper[4724]: I1002 13:09:30.881502 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2"} err="failed to get container status \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": rpc error: code = NotFound desc = could 
not find container \"9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2\": container with ID starting with 9c7841b20007f0436754e669fb6f1a85de1f0ced1c0b2686ae0a20b56ab878f2 not found: ID does not exist" Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718335 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"f9ce5542775f715e2d20fe31c6544146045c5a05d5a9d13d52869ab40196edc2"} Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718734 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"51b6fb8ce60ca211a3e4de7d01ac4b9abd21a736a75fb004ee73b7a1de33549d"} Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718750 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"6ab97bb97628e2297a025c0382ae099e3f78d53b932017b3aaa0f124a43e391c"} Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718764 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"86406171ecafbddcf9ae70fbd2338acc977d445f3b5e8ed636d3d23724a596ed"} Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718777 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"6d4fcb81b44cdbcfbbee6c1e4849f100497c8751421f92bfbde4d6473ea0ca22"} Oct 02 13:09:31 crc kubenswrapper[4724]: I1002 13:09:31.718788 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" 
event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"8c7352237ac43b37bc105221d7e9aa81447edd235731f4789c12a86fb2a6285f"} Oct 02 13:09:32 crc kubenswrapper[4724]: I1002 13:09:32.318845 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4089ad23-969c-4222-a8ed-e141ec291e80" path="/var/lib/kubelet/pods/4089ad23-969c-4222-a8ed-e141ec291e80/volumes" Oct 02 13:09:33 crc kubenswrapper[4724]: I1002 13:09:33.740929 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"5fb2bd6187574db9bac0cec2d3d7f34b24aeb3dd33af11cced84355ca74bd6f6"} Oct 02 13:09:34 crc kubenswrapper[4724]: I1002 13:09:34.734697 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:09:34 crc kubenswrapper[4724]: I1002 13:09:34.735238 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:09:34 crc kubenswrapper[4724]: I1002 13:09:34.735324 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:09:34 crc kubenswrapper[4724]: I1002 13:09:34.736199 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57"} 
pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:09:34 crc kubenswrapper[4724]: I1002 13:09:34.736289 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57" gracePeriod=600 Oct 02 13:09:35 crc kubenswrapper[4724]: I1002 13:09:35.756927 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57" exitCode=0 Oct 02 13:09:35 crc kubenswrapper[4724]: I1002 13:09:35.757003 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57"} Oct 02 13:09:35 crc kubenswrapper[4724]: I1002 13:09:35.757183 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1"} Oct 02 13:09:35 crc kubenswrapper[4724]: I1002 13:09:35.757208 4724 scope.go:117] "RemoveContainer" containerID="0e1e59e5ee7d2a3679bcc3637d0a4c2bf504931cc1e07c9c6a217c9a76b76895" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.767773 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" event={"ID":"252e3e73-b46f-414a-96f7-c264bcd63b32","Type":"ContainerStarted","Data":"7ee83854b24551a1ebbc110b6b0d1eeda45a09a92d1fd3ecbb1554bc194b7785"} Oct 02 13:09:36 
crc kubenswrapper[4724]: I1002 13:09:36.768049 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.768060 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.768269 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.797915 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.801615 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:09:36 crc kubenswrapper[4724]: I1002 13:09:36.803023 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" podStartSLOduration=7.803011186 podStartE2EDuration="7.803011186s" podCreationTimestamp="2025-10-02 13:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:09:36.802916823 +0000 UTC m=+641.257675964" watchObservedRunningTime="2025-10-02 13:09:36.803011186 +0000 UTC m=+641.257770307" Oct 02 13:09:44 crc kubenswrapper[4724]: I1002 13:09:44.314216 4724 scope.go:117] "RemoveContainer" containerID="14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076" Oct 02 13:09:44 crc kubenswrapper[4724]: E1002 13:09:44.314992 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pr276_openshift-multus(c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4)\"" 
pod="openshift-multus/multus-pr276" podUID="c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.774024 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx"] Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.775680 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.777610 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.788224 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx"] Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.876818 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.876887 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.876968 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2wp96\" (UniqueName: \"kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.978180 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.978581 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.978795 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wp96\" (UniqueName: \"kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.978899 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.978833 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:52 crc kubenswrapper[4724]: I1002 13:09:52.999833 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wp96\" (UniqueName: \"kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: I1002 13:09:53.097784 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.123414 4724 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(170d602746546294d9c69aadce879c54a1f0161baeb54448aae981b19cbeca90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.123660 4724 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(170d602746546294d9c69aadce879c54a1f0161baeb54448aae981b19cbeca90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.123684 4724 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(170d602746546294d9c69aadce879c54a1f0161baeb54448aae981b19cbeca90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.123732 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace(7c42221c-4802-40e9-b4ef-244e6e79d969)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace(7c42221c-4802-40e9-b4ef-244e6e79d969)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(170d602746546294d9c69aadce879c54a1f0161baeb54448aae981b19cbeca90): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" Oct 02 13:09:53 crc kubenswrapper[4724]: I1002 13:09:53.903397 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: I1002 13:09:53.904087 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.926334 4724 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(360126e75a2999a201c3e26e8ab5b77bfa51c05818dd17c62bb8eddc1c63a972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.926431 4724 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(360126e75a2999a201c3e26e8ab5b77bfa51c05818dd17c62bb8eddc1c63a972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.926465 4724 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(360126e75a2999a201c3e26e8ab5b77bfa51c05818dd17c62bb8eddc1c63a972): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:09:53 crc kubenswrapper[4724]: E1002 13:09:53.926569 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace(7c42221c-4802-40e9-b4ef-244e6e79d969)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace(7c42221c-4802-40e9-b4ef-244e6e79d969)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_openshift-marketplace_7c42221c-4802-40e9-b4ef-244e6e79d969_0(360126e75a2999a201c3e26e8ab5b77bfa51c05818dd17c62bb8eddc1c63a972): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" Oct 02 13:09:59 crc kubenswrapper[4724]: I1002 13:09:59.314290 4724 scope.go:117] "RemoveContainer" containerID="14f9a64b6c087079ffb4c1374976c3a597724a7ba274f00574e30df84ed84076" Oct 02 13:09:59 crc kubenswrapper[4724]: I1002 13:09:59.942419 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr276_c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4/kube-multus/2.log" Oct 02 13:09:59 crc kubenswrapper[4724]: I1002 13:09:59.942989 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr276" event={"ID":"c67e3c27-01fd-47f5-a7fb-a2ae1e7753d4","Type":"ContainerStarted","Data":"6490f13c7656601b627b00fbd9f9d014bedebef72612f538e64169d060b3dea2"} Oct 02 13:10:00 crc kubenswrapper[4724]: I1002 13:10:00.218763 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvbkw" Oct 02 13:10:07 crc kubenswrapper[4724]: I1002 13:10:07.312849 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:10:07 crc kubenswrapper[4724]: I1002 13:10:07.313992 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:10:07 crc kubenswrapper[4724]: I1002 13:10:07.732941 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx"] Oct 02 13:10:07 crc kubenswrapper[4724]: W1002 13:10:07.740055 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c42221c_4802_40e9_b4ef_244e6e79d969.slice/crio-297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223 WatchSource:0}: Error finding container 297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223: Status 404 returned error can't find the container with id 297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223 Oct 02 13:10:07 crc kubenswrapper[4724]: I1002 13:10:07.996800 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerStarted","Data":"d4ae6c54a35f071dacf32ea36e8b48572d2260ad52de45e7266efc92fc944057"} Oct 02 13:10:07 crc kubenswrapper[4724]: I1002 13:10:07.997335 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerStarted","Data":"297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223"} Oct 02 13:10:09 crc kubenswrapper[4724]: I1002 13:10:09.007888 4724 generic.go:334] "Generic (PLEG): container finished" podID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerID="d4ae6c54a35f071dacf32ea36e8b48572d2260ad52de45e7266efc92fc944057" exitCode=0 Oct 02 13:10:09 crc kubenswrapper[4724]: I1002 13:10:09.007985 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerDied","Data":"d4ae6c54a35f071dacf32ea36e8b48572d2260ad52de45e7266efc92fc944057"} Oct 02 13:10:09 crc kubenswrapper[4724]: I1002 13:10:09.026008 4724 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:10:11 crc kubenswrapper[4724]: I1002 13:10:11.024164 4724 generic.go:334] "Generic (PLEG): container finished" podID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerID="aa274a50f32a945f797daa5fcad3b39ef58493438a96b50fdb80a98d27cf8c86" exitCode=0 Oct 02 13:10:11 crc kubenswrapper[4724]: I1002 13:10:11.024266 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerDied","Data":"aa274a50f32a945f797daa5fcad3b39ef58493438a96b50fdb80a98d27cf8c86"} Oct 02 13:10:12 crc kubenswrapper[4724]: I1002 13:10:12.039403 4724 generic.go:334] "Generic (PLEG): container finished" podID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerID="f5d711f03e8eb08a776022a94e2aed976a8c48e7735bd111fef7ff870d35d95d" exitCode=0 Oct 02 13:10:12 crc kubenswrapper[4724]: I1002 13:10:12.039496 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerDied","Data":"f5d711f03e8eb08a776022a94e2aed976a8c48e7735bd111fef7ff870d35d95d"} Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.271309 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.364908 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle\") pod \"7c42221c-4802-40e9-b4ef-244e6e79d969\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.365030 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util\") pod \"7c42221c-4802-40e9-b4ef-244e6e79d969\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.365107 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wp96\" (UniqueName: \"kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96\") pod \"7c42221c-4802-40e9-b4ef-244e6e79d969\" (UID: \"7c42221c-4802-40e9-b4ef-244e6e79d969\") " Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.366992 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle" (OuterVolumeSpecName: "bundle") pod "7c42221c-4802-40e9-b4ef-244e6e79d969" (UID: "7c42221c-4802-40e9-b4ef-244e6e79d969"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.372858 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96" (OuterVolumeSpecName: "kube-api-access-2wp96") pod "7c42221c-4802-40e9-b4ef-244e6e79d969" (UID: "7c42221c-4802-40e9-b4ef-244e6e79d969"). InnerVolumeSpecName "kube-api-access-2wp96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.467322 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wp96\" (UniqueName: \"kubernetes.io/projected/7c42221c-4802-40e9-b4ef-244e6e79d969-kube-api-access-2wp96\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.467814 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.515473 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util" (OuterVolumeSpecName: "util") pod "7c42221c-4802-40e9-b4ef-244e6e79d969" (UID: "7c42221c-4802-40e9-b4ef-244e6e79d969"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:10:13 crc kubenswrapper[4724]: I1002 13:10:13.569843 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c42221c-4802-40e9-b4ef-244e6e79d969-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:14 crc kubenswrapper[4724]: I1002 13:10:14.052083 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" event={"ID":"7c42221c-4802-40e9-b4ef-244e6e79d969","Type":"ContainerDied","Data":"297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223"} Oct 02 13:10:14 crc kubenswrapper[4724]: I1002 13:10:14.052145 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="297bf5ba3895ba13e199f518e2d2e83dc64767637d896e263021d3dff45ec223" Oct 02 13:10:14 crc kubenswrapper[4724]: I1002 13:10:14.052159 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.918855 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b"] Oct 02 13:10:25 crc kubenswrapper[4724]: E1002 13:10:25.919634 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="pull" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.919651 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="pull" Oct 02 13:10:25 crc kubenswrapper[4724]: E1002 13:10:25.919662 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="extract" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.919669 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="extract" Oct 02 13:10:25 crc kubenswrapper[4724]: E1002 13:10:25.919677 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="util" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.919685 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="util" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.919790 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c42221c-4802-40e9-b4ef-244e6e79d969" containerName="extract" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.920232 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.922595 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.922632 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.923069 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.923129 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.923818 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vl28k" Oct 02 13:10:25 crc kubenswrapper[4724]: I1002 13:10:25.931304 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b"] Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.027803 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5x77\" (UniqueName: \"kubernetes.io/projected/9eb7bd6b-79d0-4b9d-876d-36b22622162d-kube-api-access-x5x77\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.027854 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-webhook-cert\") pod 
\"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.027880 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-apiservice-cert\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.128693 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-apiservice-cert\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.128831 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5x77\" (UniqueName: \"kubernetes.io/projected/9eb7bd6b-79d0-4b9d-876d-36b22622162d-kube-api-access-x5x77\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.128859 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-webhook-cert\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc 
kubenswrapper[4724]: I1002 13:10:26.146450 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-apiservice-cert\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.146515 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb7bd6b-79d0-4b9d-876d-36b22622162d-webhook-cert\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.150844 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5x77\" (UniqueName: \"kubernetes.io/projected/9eb7bd6b-79d0-4b9d-876d-36b22622162d-kube-api-access-x5x77\") pod \"metallb-operator-controller-manager-54bb9cccbc-82d7b\" (UID: \"9eb7bd6b-79d0-4b9d-876d-36b22622162d\") " pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.238422 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.277496 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc"] Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.278198 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.282965 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dr4xv" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.283268 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.283529 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.331428 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc"] Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.431568 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-apiservice-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.431667 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4swvj\" (UniqueName: \"kubernetes.io/projected/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-kube-api-access-4swvj\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.431870 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-webhook-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.533142 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-apiservice-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.533991 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4swvj\" (UniqueName: \"kubernetes.io/projected/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-kube-api-access-4swvj\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.534525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-webhook-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.539188 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-apiservice-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 
13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.539270 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-webhook-cert\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.560253 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b"] Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.561091 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4swvj\" (UniqueName: \"kubernetes.io/projected/803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3-kube-api-access-4swvj\") pod \"metallb-operator-webhook-server-57f56bd847-ffkcc\" (UID: \"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3\") " pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.641877 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:26 crc kubenswrapper[4724]: I1002 13:10:26.861279 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc"] Oct 02 13:10:27 crc kubenswrapper[4724]: I1002 13:10:27.132470 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" event={"ID":"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3","Type":"ContainerStarted","Data":"3a8bb85b0052268151bfbce28e619e1c4c20ac6a9b8f9add26843d6faa42001a"} Oct 02 13:10:27 crc kubenswrapper[4724]: I1002 13:10:27.133723 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" event={"ID":"9eb7bd6b-79d0-4b9d-876d-36b22622162d","Type":"ContainerStarted","Data":"9ab0654c43251c68d97e14da82ebb4bbff7df8c5a03140871176907f646ebb45"} Oct 02 13:10:30 crc kubenswrapper[4724]: I1002 13:10:30.159579 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" event={"ID":"9eb7bd6b-79d0-4b9d-876d-36b22622162d","Type":"ContainerStarted","Data":"152d54bcb61d9550c911dcb464f059e23874a72f821342d7dac4c91674342919"} Oct 02 13:10:30 crc kubenswrapper[4724]: I1002 13:10:30.160233 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:10:30 crc kubenswrapper[4724]: I1002 13:10:30.183649 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" podStartSLOduration=2.436174154 podStartE2EDuration="5.183631025s" podCreationTimestamp="2025-10-02 13:10:25 +0000 UTC" firstStartedPulling="2025-10-02 13:10:26.574780034 +0000 UTC m=+691.029539155" lastFinishedPulling="2025-10-02 13:10:29.322236905 +0000 UTC 
m=+693.776996026" observedRunningTime="2025-10-02 13:10:30.182498003 +0000 UTC m=+694.637257124" watchObservedRunningTime="2025-10-02 13:10:30.183631025 +0000 UTC m=+694.638390146" Oct 02 13:10:32 crc kubenswrapper[4724]: I1002 13:10:32.171003 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" event={"ID":"803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3","Type":"ContainerStarted","Data":"0ce325d40d16fb0c5c8ffc87135ac3d89d8236b0ed45d39e4823691d52b4afd4"} Oct 02 13:10:32 crc kubenswrapper[4724]: I1002 13:10:32.171415 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:10:32 crc kubenswrapper[4724]: I1002 13:10:32.210753 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" podStartSLOduration=1.50386926 podStartE2EDuration="6.210731365s" podCreationTimestamp="2025-10-02 13:10:26 +0000 UTC" firstStartedPulling="2025-10-02 13:10:26.874001133 +0000 UTC m=+691.328760254" lastFinishedPulling="2025-10-02 13:10:31.580863228 +0000 UTC m=+696.035622359" observedRunningTime="2025-10-02 13:10:32.208633207 +0000 UTC m=+696.663392338" watchObservedRunningTime="2025-10-02 13:10:32.210731365 +0000 UTC m=+696.665490496" Oct 02 13:10:46 crc kubenswrapper[4724]: I1002 13:10:46.649994 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57f56bd847-ffkcc" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.241571 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-54bb9cccbc-82d7b" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.928107 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hrpd4"] Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 
13:11:06.930322 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.936894 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.936920 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.937566 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9mrzh" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.966740 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz"] Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.967512 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.969317 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 13:11:06 crc kubenswrapper[4724]: I1002 13:11:06.999149 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz"] Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.095918 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kvdgz"] Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.096766 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.099236 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g5xtp" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.099444 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.099504 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.099610 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.105680 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdblm\" (UniqueName: \"kubernetes.io/projected/174f90c3-3227-45c0-b74f-541b539be8d5-kube-api-access-pdblm\") pod \"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.105882 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.105988 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svsp\" (UniqueName: \"kubernetes.io/projected/4d5fd30f-d080-4759-b828-d40f2293c6c7-kube-api-access-6svsp\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc 
kubenswrapper[4724]: I1002 13:11:07.106093 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-conf\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106174 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-startup\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106293 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-sockets\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106352 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106400 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-reloader\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106425 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/174f90c3-3227-45c0-b74f-541b539be8d5-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106444 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106462 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjb5k\" (UniqueName: \"kubernetes.io/projected/1de6035f-4c39-40b4-af8b-24fed7520702-kube-api-access-wjb5k\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106478 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics-certs\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.106495 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1de6035f-4c39-40b4-af8b-24fed7520702-metallb-excludel2\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.126228 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-lcf4r"] Oct 02 13:11:07 crc kubenswrapper[4724]: 
I1002 13:11:07.127260 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.130254 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.135019 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-lcf4r"] Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207333 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/174f90c3-3227-45c0-b74f-541b539be8d5-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207408 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207441 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics-certs\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207460 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjb5k\" (UniqueName: \"kubernetes.io/projected/1de6035f-4c39-40b4-af8b-24fed7520702-kube-api-access-wjb5k\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc 
kubenswrapper[4724]: I1002 13:11:07.207494 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-metrics-certs\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207514 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1de6035f-4c39-40b4-af8b-24fed7520702-metallb-excludel2\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207549 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207568 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdblm\" (UniqueName: \"kubernetes.io/projected/174f90c3-3227-45c0-b74f-541b539be8d5-kube-api-access-pdblm\") pod \"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.207593 4724 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.207671 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist podName:1de6035f-4c39-40b4-af8b-24fed7520702 nodeName:}" failed. 
No retries permitted until 2025-10-02 13:11:07.707653031 +0000 UTC m=+732.162412152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist") pod "speaker-kvdgz" (UID: "1de6035f-4c39-40b4-af8b-24fed7520702") : secret "metallb-memberlist" not found Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207599 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svsp\" (UniqueName: \"kubernetes.io/projected/4d5fd30f-d080-4759-b828-d40f2293c6c7-kube-api-access-6svsp\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207847 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-conf\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207868 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-startup\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207947 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dn2\" (UniqueName: \"kubernetes.io/projected/5fd4062f-1d62-422b-8190-b3392d13b74e-kube-api-access-24dn2\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207974 4724 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-sockets\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.207992 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.208059 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-cert\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.208079 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-reloader\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.208599 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-reloader\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.208714 4724 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.208819 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-conf\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.208932 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs podName:1de6035f-4c39-40b4-af8b-24fed7520702 nodeName:}" failed. No retries permitted until 2025-10-02 13:11:07.708906096 +0000 UTC m=+732.163665287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs") pod "speaker-kvdgz" (UID: "1de6035f-4c39-40b4-af8b-24fed7520702") : secret "speaker-certs-secret" not found Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.209023 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1de6035f-4c39-40b4-af8b-24fed7520702-metallb-excludel2\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.209182 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-sockets\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.209575 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.209650 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4d5fd30f-d080-4759-b828-d40f2293c6c7-frr-startup\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.224209 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/174f90c3-3227-45c0-b74f-541b539be8d5-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.224946 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d5fd30f-d080-4759-b828-d40f2293c6c7-metrics-certs\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.225425 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjb5k\" (UniqueName: \"kubernetes.io/projected/1de6035f-4c39-40b4-af8b-24fed7520702-kube-api-access-wjb5k\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.232630 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svsp\" (UniqueName: \"kubernetes.io/projected/4d5fd30f-d080-4759-b828-d40f2293c6c7-kube-api-access-6svsp\") pod \"frr-k8s-hrpd4\" (UID: \"4d5fd30f-d080-4759-b828-d40f2293c6c7\") " pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.232813 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdblm\" (UniqueName: \"kubernetes.io/projected/174f90c3-3227-45c0-b74f-541b539be8d5-kube-api-access-pdblm\") pod 
\"frr-k8s-webhook-server-64bf5d555-sg7xz\" (UID: \"174f90c3-3227-45c0-b74f-541b539be8d5\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.248573 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.284821 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.308681 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-metrics-certs\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.308783 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dn2\" (UniqueName: \"kubernetes.io/projected/5fd4062f-1d62-422b-8190-b3392d13b74e-kube-api-access-24dn2\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.308831 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-cert\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.310480 4724 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.317415 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-metrics-certs\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.324154 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fd4062f-1d62-422b-8190-b3392d13b74e-cert\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.327895 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dn2\" (UniqueName: \"kubernetes.io/projected/5fd4062f-1d62-422b-8190-b3392d13b74e-kube-api-access-24dn2\") pod \"controller-68d546b9d8-lcf4r\" (UID: \"5fd4062f-1d62-422b-8190-b3392d13b74e\") " pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.450075 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.504299 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz"]
Oct 02 13:11:07 crc kubenswrapper[4724]: W1002 13:11:07.514972 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod174f90c3_3227_45c0_b74f_541b539be8d5.slice/crio-d8beb16c8121952ddfc05d27064f54e49f53d5feefa9ddd5f74d05411fb9e62a WatchSource:0}: Error finding container d8beb16c8121952ddfc05d27064f54e49f53d5feefa9ddd5f74d05411fb9e62a: Status 404 returned error can't find the container with id d8beb16c8121952ddfc05d27064f54e49f53d5feefa9ddd5f74d05411fb9e62a
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.642863 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-lcf4r"]
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.713547 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.713606 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.713645 4724 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 02 13:11:07 crc kubenswrapper[4724]: E1002 13:11:07.713709 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist podName:1de6035f-4c39-40b4-af8b-24fed7520702 nodeName:}" failed. No retries permitted until 2025-10-02 13:11:08.713692991 +0000 UTC m=+733.168452112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist") pod "speaker-kvdgz" (UID: "1de6035f-4c39-40b4-af8b-24fed7520702") : secret "metallb-memberlist" not found
Oct 02 13:11:07 crc kubenswrapper[4724]: I1002 13:11:07.717344 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-metrics-certs\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:08 crc kubenswrapper[4724]: I1002 13:11:08.363137 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" event={"ID":"174f90c3-3227-45c0-b74f-541b539be8d5","Type":"ContainerStarted","Data":"d8beb16c8121952ddfc05d27064f54e49f53d5feefa9ddd5f74d05411fb9e62a"}
Oct 02 13:11:08 crc kubenswrapper[4724]: I1002 13:11:08.364988 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lcf4r" event={"ID":"5fd4062f-1d62-422b-8190-b3392d13b74e","Type":"ContainerStarted","Data":"2f364f408e8e30fa8287efdbe06d9e52aa1093477756e5b380ca81d2c0f640d4"}
Oct 02 13:11:08 crc kubenswrapper[4724]: I1002 13:11:08.726902 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:08 crc kubenswrapper[4724]: E1002 13:11:08.727050 4724 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 02 13:11:08 crc kubenswrapper[4724]: E1002 13:11:08.727136 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist podName:1de6035f-4c39-40b4-af8b-24fed7520702 nodeName:}" failed. No retries permitted until 2025-10-02 13:11:10.727117727 +0000 UTC m=+735.181876848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist") pod "speaker-kvdgz" (UID: "1de6035f-4c39-40b4-af8b-24fed7520702") : secret "metallb-memberlist" not found
Oct 02 13:11:09 crc kubenswrapper[4724]: I1002 13:11:09.371342 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"9d116227b5ec1f77064d1ed97e4cf18737e9b1819379f06b32e8f59e1c0d5521"}
Oct 02 13:11:09 crc kubenswrapper[4724]: I1002 13:11:09.376220 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lcf4r" event={"ID":"5fd4062f-1d62-422b-8190-b3392d13b74e","Type":"ContainerStarted","Data":"8865023c6b5eb4c8b480d9b3b967144d602e522e7ce41a69dab6b7aef0a9da8c"}
Oct 02 13:11:10 crc kubenswrapper[4724]: I1002 13:11:10.751018 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:10 crc kubenswrapper[4724]: I1002 13:11:10.758960 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1de6035f-4c39-40b4-af8b-24fed7520702-memberlist\") pod \"speaker-kvdgz\" (UID: \"1de6035f-4c39-40b4-af8b-24fed7520702\") " pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:11 crc kubenswrapper[4724]: I1002 13:11:11.015353 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:11 crc kubenswrapper[4724]: W1002 13:11:11.075384 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de6035f_4c39_40b4_af8b_24fed7520702.slice/crio-db024543ba6b39e4da6736e159646a2286dc494d97c949f37fc8817dfd70e823 WatchSource:0}: Error finding container db024543ba6b39e4da6736e159646a2286dc494d97c949f37fc8817dfd70e823: Status 404 returned error can't find the container with id db024543ba6b39e4da6736e159646a2286dc494d97c949f37fc8817dfd70e823
Oct 02 13:11:11 crc kubenswrapper[4724]: I1002 13:11:11.392908 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kvdgz" event={"ID":"1de6035f-4c39-40b4-af8b-24fed7520702","Type":"ContainerStarted","Data":"db024543ba6b39e4da6736e159646a2286dc494d97c949f37fc8817dfd70e823"}
Oct 02 13:11:12 crc kubenswrapper[4724]: I1002 13:11:12.403347 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kvdgz" event={"ID":"1de6035f-4c39-40b4-af8b-24fed7520702","Type":"ContainerStarted","Data":"e7cc9dc4b461aeddaf8365025a903c735434e7183db8d531c20b1025210697af"}
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.468568 4724 generic.go:334] "Generic (PLEG): container finished" podID="4d5fd30f-d080-4759-b828-d40f2293c6c7" containerID="2e5d2122e35f9de9dd021f4c6e13df1ea4bee283c842ab7a0f0037f8652e9537" exitCode=0
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.468690 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerDied","Data":"2e5d2122e35f9de9dd021f4c6e13df1ea4bee283c842ab7a0f0037f8652e9537"}
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.475676 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lcf4r" event={"ID":"5fd4062f-1d62-422b-8190-b3392d13b74e","Type":"ContainerStarted","Data":"362f2cbe3f2a744f54a4944b58773e11db97bb79954aad3b412dd6226fc6f782"}
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.476954 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.480795 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" event={"ID":"174f90c3-3227-45c0-b74f-541b539be8d5","Type":"ContainerStarted","Data":"7d4116587a25259d37aa144c5073bd4d96a76a024610263f10102ab40f8236fc"}
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.481123 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.482232 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-lcf4r"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.484067 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kvdgz" event={"ID":"1de6035f-4c39-40b4-af8b-24fed7520702","Type":"ContainerStarted","Data":"bd9571d67ed790d2550645c8ef090071203b1abfbb95724cab6a70908672c102"}
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.486181 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.493608 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kvdgz"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.547437 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-lcf4r" podStartSLOduration=3.293017653 podStartE2EDuration="18.547408296s" podCreationTimestamp="2025-10-02 13:11:07 +0000 UTC" firstStartedPulling="2025-10-02 13:11:09.123943398 +0000 UTC m=+733.578702519" lastFinishedPulling="2025-10-02 13:11:24.378334041 +0000 UTC m=+748.833093162" observedRunningTime="2025-10-02 13:11:25.543576859 +0000 UTC m=+749.998335980" watchObservedRunningTime="2025-10-02 13:11:25.547408296 +0000 UTC m=+750.002167457"
Oct 02 13:11:25 crc kubenswrapper[4724]: I1002 13:11:25.594512 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kvdgz" podStartSLOduration=5.922161991 podStartE2EDuration="18.594489125s" podCreationTimestamp="2025-10-02 13:11:07 +0000 UTC" firstStartedPulling="2025-10-02 13:11:11.898720433 +0000 UTC m=+736.353479554" lastFinishedPulling="2025-10-02 13:11:24.571047557 +0000 UTC m=+749.025806688" observedRunningTime="2025-10-02 13:11:25.571863061 +0000 UTC m=+750.026622192" watchObservedRunningTime="2025-10-02 13:11:25.594489125 +0000 UTC m=+750.049248246"
Oct 02 13:11:26 crc kubenswrapper[4724]: I1002 13:11:26.492647 4724 generic.go:334] "Generic (PLEG): container finished" podID="4d5fd30f-d080-4759-b828-d40f2293c6c7" containerID="a085985d1388f5923ae486510abb1924a438f24b15e0fa5c2dcff78dddaabf38" exitCode=0
Oct 02 13:11:26 crc kubenswrapper[4724]: I1002 13:11:26.492754 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerDied","Data":"a085985d1388f5923ae486510abb1924a438f24b15e0fa5c2dcff78dddaabf38"}
Oct 02 13:11:26 crc kubenswrapper[4724]: I1002 13:11:26.514891 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" podStartSLOduration=3.463422944 podStartE2EDuration="20.514875295s" podCreationTimestamp="2025-10-02 13:11:06 +0000 UTC" firstStartedPulling="2025-10-02 13:11:07.517213939 +0000 UTC m=+731.971973060" lastFinishedPulling="2025-10-02 13:11:24.56866628 +0000 UTC m=+749.023425411" observedRunningTime="2025-10-02 13:11:25.593889648 +0000 UTC m=+750.048648789" watchObservedRunningTime="2025-10-02 13:11:26.514875295 +0000 UTC m=+750.969634416"
Oct 02 13:11:27 crc kubenswrapper[4724]: I1002 13:11:27.500830 4724 generic.go:334] "Generic (PLEG): container finished" podID="4d5fd30f-d080-4759-b828-d40f2293c6c7" containerID="dd0684f0699fbad854543e98ab1b806c269389c1a77fdcb93c3e273a91509833" exitCode=0
Oct 02 13:11:27 crc kubenswrapper[4724]: I1002 13:11:27.500878 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerDied","Data":"dd0684f0699fbad854543e98ab1b806c269389c1a77fdcb93c3e273a91509833"}
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.509118 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"5101f2bace3ab943b680ed64aebacf3bef2ba7539db4cef27b685e3084793a0c"}
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.509391 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"98911bc7147e1f76ee20911a71a5a7c92e767500aab9dfa6a5f043f217f9d9ca"}
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.509400 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"bdbbc966a865b84860bba51daeae9c596f5a68ce778a100f09dec12bb5306590"}
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.509409 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"792cacc2cb05675642660b46a45ee3edaff4f59abcf421e018bbf19c551eaea2"}
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.610263 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"]
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.610719 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager" containerID="cri-o://c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf" gracePeriod=30
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.735526 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"]
Oct 02 13:11:28 crc kubenswrapper[4724]: I1002 13:11:28.735743 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager" containerID="cri-o://6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b" gracePeriod=30
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.147677 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.244516 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8k7f\" (UniqueName: \"kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f\") pod \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.245173 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca\") pod \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.245293 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert\") pod \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.245317 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config\") pod \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\" (UID: \"bbeff7ab-5d85-4709-bd85-22d6e99ff30c\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.245776 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca" (OuterVolumeSpecName: "client-ca") pod "bbeff7ab-5d85-4709-bd85-22d6e99ff30c" (UID: "bbeff7ab-5d85-4709-bd85-22d6e99ff30c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.246221 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config" (OuterVolumeSpecName: "config") pod "bbeff7ab-5d85-4709-bd85-22d6e99ff30c" (UID: "bbeff7ab-5d85-4709-bd85-22d6e99ff30c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.251137 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bbeff7ab-5d85-4709-bd85-22d6e99ff30c" (UID: "bbeff7ab-5d85-4709-bd85-22d6e99ff30c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.251148 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f" (OuterVolumeSpecName: "kube-api-access-c8k7f") pod "bbeff7ab-5d85-4709-bd85-22d6e99ff30c" (UID: "bbeff7ab-5d85-4709-bd85-22d6e99ff30c"). InnerVolumeSpecName "kube-api-access-c8k7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.346504 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-config\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.346557 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.346571 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8k7f\" (UniqueName: \"kubernetes.io/projected/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-kube-api-access-c8k7f\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.346584 4724 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbeff7ab-5d85-4709-bd85-22d6e99ff30c-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.467408 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.516220 4724 generic.go:334] "Generic (PLEG): container finished" podID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerID="6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b" exitCode=0
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.516286 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.516335 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" event={"ID":"bbeff7ab-5d85-4709-bd85-22d6e99ff30c","Type":"ContainerDied","Data":"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.516399 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd" event={"ID":"bbeff7ab-5d85-4709-bd85-22d6e99ff30c","Type":"ContainerDied","Data":"b8fbc49c6c25c04d4d5d4465d02273a374657c359202a7297bee89248aa7aac4"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.516444 4724 scope.go:117] "RemoveContainer" containerID="6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.519677 4724 generic.go:334] "Generic (PLEG): container finished" podID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerID="c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf" exitCode=0
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.519744 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" event={"ID":"9699ff16-3d72-4ba6-9055-6b707c3e223f","Type":"ContainerDied","Data":"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.519767 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7" event={"ID":"9699ff16-3d72-4ba6-9055-6b707c3e223f","Type":"ContainerDied","Data":"a74d4dce04f0ae9b58c7b830d83ad2914b2ce61a3f7091f8cc6396c4dd547327"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.519785 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zpmh7"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.530899 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"be9f0ab41c562c4448cde954bd67cfcadbb3d09a0874332e3a499713099e4c06"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.530936 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrpd4" event={"ID":"4d5fd30f-d080-4759-b828-d40f2293c6c7","Type":"ContainerStarted","Data":"7dda796e64b1b30219cea709d2b3fec3cc0baaf22ba51f045dd72a950d34ffac"}
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.531667 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hrpd4"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.534711 4724 scope.go:117] "RemoveContainer" containerID="6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"
Oct 02 13:11:29 crc kubenswrapper[4724]: E1002 13:11:29.535048 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b\": container with ID starting with 6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b not found: ID does not exist" containerID="6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.535075 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b"} err="failed to get container status \"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b\": rpc error: code = NotFound desc = could not find container \"6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b\": container with ID starting with 6c10394862d8ad4453ef8d8842c922c6a1dab042fc868311c73f30c3fa99c95b not found: ID does not exist"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.535093 4724 scope.go:117] "RemoveContainer" containerID="c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.543635 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.548862 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config\") pod \"9699ff16-3d72-4ba6-9055-6b707c3e223f\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.548910 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn64z\" (UniqueName: \"kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z\") pod \"9699ff16-3d72-4ba6-9055-6b707c3e223f\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.548948 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert\") pod \"9699ff16-3d72-4ba6-9055-6b707c3e223f\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.548981 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca\") pod \"9699ff16-3d72-4ba6-9055-6b707c3e223f\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.549078 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles\") pod \"9699ff16-3d72-4ba6-9055-6b707c3e223f\" (UID: \"9699ff16-3d72-4ba6-9055-6b707c3e223f\") "
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.549747 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9699ff16-3d72-4ba6-9055-6b707c3e223f" (UID: "9699ff16-3d72-4ba6-9055-6b707c3e223f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.549777 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config" (OuterVolumeSpecName: "config") pod "9699ff16-3d72-4ba6-9055-6b707c3e223f" (UID: "9699ff16-3d72-4ba6-9055-6b707c3e223f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.549824 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9699ff16-3d72-4ba6-9055-6b707c3e223f" (UID: "9699ff16-3d72-4ba6-9055-6b707c3e223f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.552182 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qxfvd"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.553615 4724 scope.go:117] "RemoveContainer" containerID="c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.554040 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z" (OuterVolumeSpecName: "kube-api-access-xn64z") pod "9699ff16-3d72-4ba6-9055-6b707c3e223f" (UID: "9699ff16-3d72-4ba6-9055-6b707c3e223f"). InnerVolumeSpecName "kube-api-access-xn64z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: E1002 13:11:29.554113 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf\": container with ID starting with c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf not found: ID does not exist" containerID="c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.554144 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf"} err="failed to get container status \"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf\": rpc error: code = NotFound desc = could not find container \"c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf\": container with ID starting with c9d96b73240d61675e5a2d0c17104172029912f815cb06339ec4a7dc1012e8bf not found: ID does not exist"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.554714 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9699ff16-3d72-4ba6-9055-6b707c3e223f" (UID: "9699ff16-3d72-4ba6-9055-6b707c3e223f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.573386 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hrpd4" podStartSLOduration=7.66603441 podStartE2EDuration="23.573368446s" podCreationTimestamp="2025-10-02 13:11:06 +0000 UTC" firstStartedPulling="2025-10-02 13:11:08.661358875 +0000 UTC m=+733.116118006" lastFinishedPulling="2025-10-02 13:11:24.568692921 +0000 UTC m=+749.023452042" observedRunningTime="2025-10-02 13:11:29.570862046 +0000 UTC m=+754.025621197" watchObservedRunningTime="2025-10-02 13:11:29.573368446 +0000 UTC m=+754.028127567"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.651116 4724 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.651359 4724 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-config\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.651419 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn64z\" (UniqueName: \"kubernetes.io/projected/9699ff16-3d72-4ba6-9055-6b707c3e223f-kube-api-access-xn64z\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.651475 4724 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9699ff16-3d72-4ba6-9055-6b707c3e223f-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.651526 4724 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9699ff16-3d72-4ba6-9055-6b707c3e223f-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.846724 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.849483 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zpmh7"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.926509 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h"]
Oct 02 13:11:29 crc kubenswrapper[4724]: E1002 13:11:29.926965 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.926989 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: E1002 13:11:29.927009 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.927017 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.927146 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" containerName="route-controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.927171 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" containerName="controller-manager"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.927701 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.929504 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.930237 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.930891 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.930933 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.930894 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.932585 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.932685 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.933594 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.933598 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.933598 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.933671 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.933860 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.934137 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.940301 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h"]
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.942987 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.945958 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 02 13:11:29 crc kubenswrapper[4724]: I1002 13:11:29.948349 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"]
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.056499 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrl8j\" (UniqueName: \"kubernetes.io/projected/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-kube-api-access-qrl8j\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.056679 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-config\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.056721 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-proxy-ca-bundles\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.056881 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7a64392-161e-4f37-8cda-e7e52fdaa743-serving-cert\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h"
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.056984 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-serving-cert\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"
Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.057020 4724 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-config\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.057093 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-client-ca\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.057137 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x59c\" (UniqueName: \"kubernetes.io/projected/d7a64392-161e-4f37-8cda-e7e52fdaa743-kube-api-access-4x59c\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.057165 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-client-ca\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.157978 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-serving-cert\") pod \"controller-manager-6788d9cff4-w5kqj\" 
(UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158040 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-config\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158094 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-client-ca\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158155 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x59c\" (UniqueName: \"kubernetes.io/projected/d7a64392-161e-4f37-8cda-e7e52fdaa743-kube-api-access-4x59c\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158179 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-client-ca\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158217 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qrl8j\" (UniqueName: \"kubernetes.io/projected/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-kube-api-access-qrl8j\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158255 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-config\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158277 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-proxy-ca-bundles\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.158318 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7a64392-161e-4f37-8cda-e7e52fdaa743-serving-cert\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.159722 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-client-ca\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc 
kubenswrapper[4724]: I1002 13:11:30.159950 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-client-ca\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.160072 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a64392-161e-4f37-8cda-e7e52fdaa743-config\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.160779 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-config\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.160776 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-proxy-ca-bundles\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.164655 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7a64392-161e-4f37-8cda-e7e52fdaa743-serving-cert\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.164655 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-serving-cert\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.174653 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x59c\" (UniqueName: \"kubernetes.io/projected/d7a64392-161e-4f37-8cda-e7e52fdaa743-kube-api-access-4x59c\") pod \"route-controller-manager-7d695bbf66-spw5h\" (UID: \"d7a64392-161e-4f37-8cda-e7e52fdaa743\") " pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.175141 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrl8j\" (UniqueName: \"kubernetes.io/projected/9aabb18e-2b80-4dc7-99f6-4aa565f67c0c-kube-api-access-qrl8j\") pod \"controller-manager-6788d9cff4-w5kqj\" (UID: \"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c\") " pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.247045 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.259896 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.322583 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9699ff16-3d72-4ba6-9055-6b707c3e223f" path="/var/lib/kubelet/pods/9699ff16-3d72-4ba6-9055-6b707c3e223f/volumes" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.323161 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbeff7ab-5d85-4709-bd85-22d6e99ff30c" path="/var/lib/kubelet/pods/bbeff7ab-5d85-4709-bd85-22d6e99ff30c/volumes" Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.476312 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h"] Oct 02 13:11:30 crc kubenswrapper[4724]: W1002 13:11:30.483709 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a64392_161e_4f37_8cda_e7e52fdaa743.slice/crio-b9db1b7d433bda8b2d243766ac7d9a27966bb02634319ae7dddeed5c42e25ca4 WatchSource:0}: Error finding container b9db1b7d433bda8b2d243766ac7d9a27966bb02634319ae7dddeed5c42e25ca4: Status 404 returned error can't find the container with id b9db1b7d433bda8b2d243766ac7d9a27966bb02634319ae7dddeed5c42e25ca4 Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.548461 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" event={"ID":"d7a64392-161e-4f37-8cda-e7e52fdaa743","Type":"ContainerStarted","Data":"b9db1b7d433bda8b2d243766ac7d9a27966bb02634319ae7dddeed5c42e25ca4"} Oct 02 13:11:30 crc kubenswrapper[4724]: I1002 13:11:30.727112 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6788d9cff4-w5kqj"] Oct 02 13:11:30 crc kubenswrapper[4724]: W1002 13:11:30.731139 4724 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aabb18e_2b80_4dc7_99f6_4aa565f67c0c.slice/crio-377f9ca618e66095cc536f0bb1b4cbcafb5e7d5d392bbe8e301359566858ccb7 WatchSource:0}: Error finding container 377f9ca618e66095cc536f0bb1b4cbcafb5e7d5d392bbe8e301359566858ccb7: Status 404 returned error can't find the container with id 377f9ca618e66095cc536f0bb1b4cbcafb5e7d5d392bbe8e301359566858ccb7 Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.558587 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" event={"ID":"d7a64392-161e-4f37-8cda-e7e52fdaa743","Type":"ContainerStarted","Data":"6033632a2a769e4da082c4b3d971f662a39e0fb3fea881e8f113c15f9d4d4230"} Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.559017 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.560505 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" event={"ID":"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c","Type":"ContainerStarted","Data":"25959d8b6158148459bd643e20089aa972f0d97a15d2ca9ddb496f6ad3663f93"} Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.560584 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" event={"ID":"9aabb18e-2b80-4dc7-99f6-4aa565f67c0c","Type":"ContainerStarted","Data":"377f9ca618e66095cc536f0bb1b4cbcafb5e7d5d392bbe8e301359566858ccb7"} Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.560885 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.566831 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.566935 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.581639 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d695bbf66-spw5h" podStartSLOduration=3.581621608 podStartE2EDuration="3.581621608s" podCreationTimestamp="2025-10-02 13:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:11:31.579046766 +0000 UTC m=+756.033805887" watchObservedRunningTime="2025-10-02 13:11:31.581621608 +0000 UTC m=+756.036380719" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.631945 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6788d9cff4-w5kqj" podStartSLOduration=3.631928487 podStartE2EDuration="3.631928487s" podCreationTimestamp="2025-10-02 13:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:11:31.630231089 +0000 UTC m=+756.084990230" watchObservedRunningTime="2025-10-02 13:11:31.631928487 +0000 UTC m=+756.086687608" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.787054 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.787905 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.791075 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.792858 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.798013 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.881257 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7zk\" (UniqueName: \"kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk\") pod \"mariadb-operator-index-c2vnt\" (UID: \"6f5065f2-e2e5-4b43-b548-c37deee52892\") " pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:31 crc kubenswrapper[4724]: I1002 13:11:31.983367 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7zk\" (UniqueName: \"kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk\") pod \"mariadb-operator-index-c2vnt\" (UID: \"6f5065f2-e2e5-4b43-b548-c37deee52892\") " pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.003753 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7zk\" (UniqueName: \"kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk\") pod \"mariadb-operator-index-c2vnt\" (UID: \"6f5065f2-e2e5-4b43-b548-c37deee52892\") " pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.108265 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.249095 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.290362 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.294028 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:32 crc kubenswrapper[4724]: W1002 13:11:32.307835 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5065f2_e2e5_4b43_b548_c37deee52892.slice/crio-8ba7445f7664f764d09cb16c2b7ededba119d121d83a007ef8144f650be438ec WatchSource:0}: Error finding container 8ba7445f7664f764d09cb16c2b7ededba119d121d83a007ef8144f650be438ec: Status 404 returned error can't find the container with id 8ba7445f7664f764d09cb16c2b7ededba119d121d83a007ef8144f650be438ec Oct 02 13:11:32 crc kubenswrapper[4724]: I1002 13:11:32.568879 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-c2vnt" event={"ID":"6f5065f2-e2e5-4b43-b548-c37deee52892","Type":"ContainerStarted","Data":"8ba7445f7664f764d09cb16c2b7ededba119d121d83a007ef8144f650be438ec"} Oct 02 13:11:33 crc kubenswrapper[4724]: I1002 13:11:33.574916 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-c2vnt" event={"ID":"6f5065f2-e2e5-4b43-b548-c37deee52892","Type":"ContainerStarted","Data":"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086"} Oct 02 13:11:33 crc kubenswrapper[4724]: I1002 13:11:33.595862 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-c2vnt" 
podStartSLOduration=1.499067197 podStartE2EDuration="2.595837727s" podCreationTimestamp="2025-10-02 13:11:31 +0000 UTC" firstStartedPulling="2025-10-02 13:11:32.309132099 +0000 UTC m=+756.763891220" lastFinishedPulling="2025-10-02 13:11:33.405902629 +0000 UTC m=+757.860661750" observedRunningTime="2025-10-02 13:11:33.591439224 +0000 UTC m=+758.046198345" watchObservedRunningTime="2025-10-02 13:11:33.595837727 +0000 UTC m=+758.050596848" Oct 02 13:11:34 crc kubenswrapper[4724]: I1002 13:11:34.584712 4724 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 13:11:34 crc kubenswrapper[4724]: I1002 13:11:34.734853 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:11:34 crc kubenswrapper[4724]: I1002 13:11:34.734923 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.164391 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.592186 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-c2vnt" podUID="6f5065f2-e2e5-4b43-b548-c37deee52892" containerName="registry-server" containerID="cri-o://862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086" gracePeriod=2 Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.774929 4724 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-4st9w"] Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.775859 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.779750 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-qfp5z" Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.789489 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4st9w"] Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.849796 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5t7m\" (UniqueName: \"kubernetes.io/projected/29c00177-6564-4e27-a1de-8f60e8dfc89c-kube-api-access-w5t7m\") pod \"mariadb-operator-index-4st9w\" (UID: \"29c00177-6564-4e27-a1de-8f60e8dfc89c\") " pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.951087 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5t7m\" (UniqueName: \"kubernetes.io/projected/29c00177-6564-4e27-a1de-8f60e8dfc89c-kube-api-access-w5t7m\") pod \"mariadb-operator-index-4st9w\" (UID: \"29c00177-6564-4e27-a1de-8f60e8dfc89c\") " pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:35 crc kubenswrapper[4724]: I1002 13:11:35.971062 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5t7m\" (UniqueName: \"kubernetes.io/projected/29c00177-6564-4e27-a1de-8f60e8dfc89c-kube-api-access-w5t7m\") pod \"mariadb-operator-index-4st9w\" (UID: \"29c00177-6564-4e27-a1de-8f60e8dfc89c\") " pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.054714 4724 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.099804 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.153052 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7zk\" (UniqueName: \"kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk\") pod \"6f5065f2-e2e5-4b43-b548-c37deee52892\" (UID: \"6f5065f2-e2e5-4b43-b548-c37deee52892\") " Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.157212 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk" (OuterVolumeSpecName: "kube-api-access-bg7zk") pod "6f5065f2-e2e5-4b43-b548-c37deee52892" (UID: "6f5065f2-e2e5-4b43-b548-c37deee52892"). InnerVolumeSpecName "kube-api-access-bg7zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.254726 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7zk\" (UniqueName: \"kubernetes.io/projected/6f5065f2-e2e5-4b43-b548-c37deee52892-kube-api-access-bg7zk\") on node \"crc\" DevicePath \"\"" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.481779 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4st9w"] Oct 02 13:11:36 crc kubenswrapper[4724]: W1002 13:11:36.491887 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c00177_6564_4e27_a1de_8f60e8dfc89c.slice/crio-adde2ce626adcc3709b29ca9283d2da2ddd958dce4a49fa2ccd2ca81663744ac WatchSource:0}: Error finding container adde2ce626adcc3709b29ca9283d2da2ddd958dce4a49fa2ccd2ca81663744ac: Status 404 returned error can't find the container with id adde2ce626adcc3709b29ca9283d2da2ddd958dce4a49fa2ccd2ca81663744ac Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.598060 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4st9w" event={"ID":"29c00177-6564-4e27-a1de-8f60e8dfc89c","Type":"ContainerStarted","Data":"adde2ce626adcc3709b29ca9283d2da2ddd958dce4a49fa2ccd2ca81663744ac"} Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.602732 4724 generic.go:334] "Generic (PLEG): container finished" podID="6f5065f2-e2e5-4b43-b548-c37deee52892" containerID="862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086" exitCode=0 Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.602782 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-c2vnt" event={"ID":"6f5065f2-e2e5-4b43-b548-c37deee52892","Type":"ContainerDied","Data":"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086"} Oct 02 13:11:36 crc kubenswrapper[4724]: 
I1002 13:11:36.602797 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-c2vnt" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.602817 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-c2vnt" event={"ID":"6f5065f2-e2e5-4b43-b548-c37deee52892","Type":"ContainerDied","Data":"8ba7445f7664f764d09cb16c2b7ededba119d121d83a007ef8144f650be438ec"} Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.602841 4724 scope.go:117] "RemoveContainer" containerID="862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.622096 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.625390 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-c2vnt"] Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.626869 4724 scope.go:117] "RemoveContainer" containerID="862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086" Oct 02 13:11:36 crc kubenswrapper[4724]: E1002 13:11:36.627254 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086\": container with ID starting with 862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086 not found: ID does not exist" containerID="862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086" Oct 02 13:11:36 crc kubenswrapper[4724]: I1002 13:11:36.627309 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086"} err="failed to get container status \"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086\": rpc error: 
code = NotFound desc = could not find container \"862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086\": container with ID starting with 862e462ed70f58e36c215ada274d685d330cb398c5a3a49badad5d561d1db086 not found: ID does not exist" Oct 02 13:11:37 crc kubenswrapper[4724]: I1002 13:11:37.254701 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hrpd4" Oct 02 13:11:37 crc kubenswrapper[4724]: I1002 13:11:37.290277 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sg7xz" Oct 02 13:11:37 crc kubenswrapper[4724]: I1002 13:11:37.609777 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4st9w" event={"ID":"29c00177-6564-4e27-a1de-8f60e8dfc89c","Type":"ContainerStarted","Data":"142b6be39569640b4341a2ed99539e8164460087aeddcfe08bf9ef32882d35cf"} Oct 02 13:11:37 crc kubenswrapper[4724]: I1002 13:11:37.630106 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-4st9w" podStartSLOduration=2.180656065 podStartE2EDuration="2.630082978s" podCreationTimestamp="2025-10-02 13:11:35 +0000 UTC" firstStartedPulling="2025-10-02 13:11:36.495698586 +0000 UTC m=+760.950457707" lastFinishedPulling="2025-10-02 13:11:36.945125499 +0000 UTC m=+761.399884620" observedRunningTime="2025-10-02 13:11:37.629947245 +0000 UTC m=+762.084706386" watchObservedRunningTime="2025-10-02 13:11:37.630082978 +0000 UTC m=+762.084842109" Oct 02 13:11:38 crc kubenswrapper[4724]: I1002 13:11:38.321507 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5065f2-e2e5-4b43-b548-c37deee52892" path="/var/lib/kubelet/pods/6f5065f2-e2e5-4b43-b548-c37deee52892/volumes" Oct 02 13:11:46 crc kubenswrapper[4724]: I1002 13:11:46.100392 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:46 crc kubenswrapper[4724]: I1002 13:11:46.100962 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:46 crc kubenswrapper[4724]: I1002 13:11:46.140548 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:46 crc kubenswrapper[4724]: I1002 13:11:46.692767 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-4st9w" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.404048 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv"] Oct 02 13:11:47 crc kubenswrapper[4724]: E1002 13:11:47.404259 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5065f2-e2e5-4b43-b548-c37deee52892" containerName="registry-server" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.404271 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5065f2-e2e5-4b43-b548-c37deee52892" containerName="registry-server" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.404379 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5065f2-e2e5-4b43-b548-c37deee52892" containerName="registry-server" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.405177 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.407935 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.415273 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv"] Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.501731 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.501813 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.501862 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snvg\" (UniqueName: \"kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 
13:11:47.602902 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.602967 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.603016 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snvg\" (UniqueName: \"kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.603601 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.603745 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.626910 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snvg\" (UniqueName: \"kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg\") pod \"5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:47 crc kubenswrapper[4724]: I1002 13:11:47.724202 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:48 crc kubenswrapper[4724]: I1002 13:11:48.106296 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv"] Oct 02 13:11:48 crc kubenswrapper[4724]: I1002 13:11:48.672433 4724 generic.go:334] "Generic (PLEG): container finished" podID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerID="31a567bcec0482c28e4a1e3b607d803165d292b3d19a24b31c5002467951c161" exitCode=0 Oct 02 13:11:48 crc kubenswrapper[4724]: I1002 13:11:48.672594 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" event={"ID":"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c","Type":"ContainerDied","Data":"31a567bcec0482c28e4a1e3b607d803165d292b3d19a24b31c5002467951c161"} Oct 02 13:11:48 crc kubenswrapper[4724]: I1002 13:11:48.672767 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" event={"ID":"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c","Type":"ContainerStarted","Data":"fbf2baf5cc23c3a2494d3733596bc99928f03f22442bb83e5cd99677bc60e95b"} Oct 02 13:11:53 crc kubenswrapper[4724]: I1002 13:11:53.701880 4724 generic.go:334] "Generic (PLEG): container finished" podID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerID="cb78a9f415d4be23bc1536e416b3a0c6eaa3671e5aea76e35e6172ef5e6ef2d9" exitCode=0 Oct 02 13:11:53 crc kubenswrapper[4724]: I1002 13:11:53.701952 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" event={"ID":"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c","Type":"ContainerDied","Data":"cb78a9f415d4be23bc1536e416b3a0c6eaa3671e5aea76e35e6172ef5e6ef2d9"} Oct 02 13:11:54 crc kubenswrapper[4724]: I1002 13:11:54.709892 4724 generic.go:334] "Generic (PLEG): container finished" podID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerID="ca993e450a30892e915f1cabf2fedff39aa3310e7af9e08c3c1c772c35036246" exitCode=0 Oct 02 13:11:54 crc kubenswrapper[4724]: I1002 13:11:54.709947 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" event={"ID":"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c","Type":"ContainerDied","Data":"ca993e450a30892e915f1cabf2fedff39aa3310e7af9e08c3c1c772c35036246"} Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.457740 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5nzcv"] Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.458766 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.472486 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5nzcv"] Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.514407 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.514468 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.514490 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kp76\" (UniqueName: \"kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.615475 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.615738 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.615760 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kp76\" (UniqueName: \"kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.616182 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.616245 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.638664 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kp76\" (UniqueName: \"kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76\") pod \"community-operators-5nzcv\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") " pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:55 crc kubenswrapper[4724]: I1002 13:11:55.782168 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5nzcv" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.169009 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.223986 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle\") pod \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.224210 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util\") pod \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.224261 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snvg\" (UniqueName: \"kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg\") pod \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\" (UID: \"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c\") " Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.225897 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle" (OuterVolumeSpecName: "bundle") pod "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" (UID: "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.229241 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg" (OuterVolumeSpecName: "kube-api-access-7snvg") pod "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" (UID: "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c"). InnerVolumeSpecName "kube-api-access-7snvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.235260 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util" (OuterVolumeSpecName: "util") pod "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" (UID: "cec254d8-8ae9-44e7-b7f8-40a87a42ca6c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.269802 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5nzcv"] Oct 02 13:11:56 crc kubenswrapper[4724]: W1002 13:11:56.273286 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9810e031_6fc3_49eb_98c9_300541a267dc.slice/crio-1fe7de36714883b454cbe1f1cd729aa4581bfc8385601541d10484cdee115eee WatchSource:0}: Error finding container 1fe7de36714883b454cbe1f1cd729aa4581bfc8385601541d10484cdee115eee: Status 404 returned error can't find the container with id 1fe7de36714883b454cbe1f1cd729aa4581bfc8385601541d10484cdee115eee Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.325255 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.325287 4724 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7snvg\" (UniqueName: \"kubernetes.io/projected/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-kube-api-access-7snvg\") on node \"crc\" DevicePath \"\"" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.325326 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec254d8-8ae9-44e7-b7f8-40a87a42ca6c-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.726499 4724 generic.go:334] "Generic (PLEG): container finished" podID="9810e031-6fc3-49eb-98c9-300541a267dc" containerID="40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83" exitCode=0 Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.726605 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerDied","Data":"40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83"} Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.726955 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerStarted","Data":"1fe7de36714883b454cbe1f1cd729aa4581bfc8385601541d10484cdee115eee"} Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.732639 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" event={"ID":"cec254d8-8ae9-44e7-b7f8-40a87a42ca6c","Type":"ContainerDied","Data":"fbf2baf5cc23c3a2494d3733596bc99928f03f22442bb83e5cd99677bc60e95b"} Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.732786 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbf2baf5cc23c3a2494d3733596bc99928f03f22442bb83e5cd99677bc60e95b" Oct 02 13:11:56 crc kubenswrapper[4724]: I1002 13:11:56.732702 4724 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv" Oct 02 13:11:58 crc kubenswrapper[4724]: I1002 13:11:58.742697 4724 generic.go:334] "Generic (PLEG): container finished" podID="9810e031-6fc3-49eb-98c9-300541a267dc" containerID="99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec" exitCode=0 Oct 02 13:11:58 crc kubenswrapper[4724]: I1002 13:11:58.742747 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerDied","Data":"99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec"} Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.750915 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerStarted","Data":"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"} Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.773438 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5nzcv" podStartSLOduration=2.034274303 podStartE2EDuration="4.773423495s" podCreationTimestamp="2025-10-02 13:11:55 +0000 UTC" firstStartedPulling="2025-10-02 13:11:56.728495114 +0000 UTC m=+781.183254235" lastFinishedPulling="2025-10-02 13:11:59.467644306 +0000 UTC m=+783.922403427" observedRunningTime="2025-10-02 13:11:59.771217018 +0000 UTC m=+784.225976149" watchObservedRunningTime="2025-10-02 13:11:59.773423495 +0000 UTC m=+784.228182616" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.813418 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc"] Oct 02 13:11:59 crc kubenswrapper[4724]: E1002 13:11:59.813724 4724 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="pull" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.813748 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="pull" Oct 02 13:11:59 crc kubenswrapper[4724]: E1002 13:11:59.813776 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="util" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.813785 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="util" Oct 02 13:11:59 crc kubenswrapper[4724]: E1002 13:11:59.813803 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="extract" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.813811 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="extract" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.813956 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec254d8-8ae9-44e7-b7f8-40a87a42ca6c" containerName="extract" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.814721 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.817004 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.817063 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kbfl8" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.817068 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.829970 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc"] Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.873663 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-apiservice-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.873757 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl2sk\" (UniqueName: \"kubernetes.io/projected/58c544da-0083-4968-a2bd-75944651e5ea-kube-api-access-nl2sk\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.873827 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-webhook-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.974642 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-apiservice-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.974749 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl2sk\" (UniqueName: \"kubernetes.io/projected/58c544da-0083-4968-a2bd-75944651e5ea-kube-api-access-nl2sk\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.974810 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-webhook-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.979556 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-apiservice-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " 
pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.979619 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58c544da-0083-4968-a2bd-75944651e5ea-webhook-cert\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:11:59 crc kubenswrapper[4724]: I1002 13:11:59.991994 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl2sk\" (UniqueName: \"kubernetes.io/projected/58c544da-0083-4968-a2bd-75944651e5ea-kube-api-access-nl2sk\") pod \"mariadb-operator-controller-manager-7cdf88d46-nvsmc\" (UID: \"58c544da-0083-4968-a2bd-75944651e5ea\") " pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:12:00 crc kubenswrapper[4724]: I1002 13:12:00.134026 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" Oct 02 13:12:00 crc kubenswrapper[4724]: I1002 13:12:00.590829 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc"] Oct 02 13:12:00 crc kubenswrapper[4724]: W1002 13:12:00.595829 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c544da_0083_4968_a2bd_75944651e5ea.slice/crio-08c5cca332118077e89154cdaf19af429f6d2f15d693b4cf7d4c6897e94ab8da WatchSource:0}: Error finding container 08c5cca332118077e89154cdaf19af429f6d2f15d693b4cf7d4c6897e94ab8da: Status 404 returned error can't find the container with id 08c5cca332118077e89154cdaf19af429f6d2f15d693b4cf7d4c6897e94ab8da Oct 02 13:12:00 crc kubenswrapper[4724]: I1002 13:12:00.757367 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" event={"ID":"58c544da-0083-4968-a2bd-75944651e5ea","Type":"ContainerStarted","Data":"08c5cca332118077e89154cdaf19af429f6d2f15d693b4cf7d4c6897e94ab8da"} Oct 02 13:12:04 crc kubenswrapper[4724]: I1002 13:12:04.733745 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:12:04 crc kubenswrapper[4724]: I1002 13:12:04.734033 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:12:04 crc kubenswrapper[4724]: I1002 13:12:04.783282 
4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" event={"ID":"58c544da-0083-4968-a2bd-75944651e5ea","Type":"ContainerStarted","Data":"6ae217afd8c801364d3f3fd78deb1a80ed03cf1313250dad8ac6c2a9b836bbe0"}
Oct 02 13:12:05 crc kubenswrapper[4724]: I1002 13:12:05.783280 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:05 crc kubenswrapper[4724]: I1002 13:12:05.783548 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:05 crc kubenswrapper[4724]: I1002 13:12:05.835290 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:06 crc kubenswrapper[4724]: I1002 13:12:06.837414 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:07 crc kubenswrapper[4724]: I1002 13:12:07.802027 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" event={"ID":"58c544da-0083-4968-a2bd-75944651e5ea","Type":"ContainerStarted","Data":"38bcb79c1ec34ec25508dde8b0844e27c462bb722953ca351ebac730a3508572"}
Oct 02 13:12:07 crc kubenswrapper[4724]: I1002 13:12:07.802362 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc"
Oct 02 13:12:07 crc kubenswrapper[4724]: I1002 13:12:07.823789 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc" podStartSLOduration=2.210783149 podStartE2EDuration="8.823759395s" podCreationTimestamp="2025-10-02 13:11:59 +0000 UTC" firstStartedPulling="2025-10-02 13:12:00.598609452 +0000 UTC m=+785.053368573" lastFinishedPulling="2025-10-02 13:12:07.211585698 +0000 UTC m=+791.666344819" observedRunningTime="2025-10-02 13:12:07.820054098 +0000 UTC m=+792.274813279" watchObservedRunningTime="2025-10-02 13:12:07.823759395 +0000 UTC m=+792.278518556"
Oct 02 13:12:08 crc kubenswrapper[4724]: I1002 13:12:08.633025 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5nzcv"]
Oct 02 13:12:08 crc kubenswrapper[4724]: I1002 13:12:08.810103 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5nzcv" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="registry-server" containerID="cri-o://c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a" gracePeriod=2
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.242242 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"]
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.243823 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.263869 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"]
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.305562 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd89z\" (UniqueName: \"kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.305624 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.305710 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.407126 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.407319 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd89z\" (UniqueName: \"kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.407343 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.408218 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.408323 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.434602 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd89z\" (UniqueName: \"kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z\") pod \"redhat-operators-spbdk\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") " pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.565398 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:09 crc kubenswrapper[4724]: I1002 13:12:09.991692 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"]
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.139077 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7cdf88d46-nvsmc"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.596471 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.730465 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content\") pod \"9810e031-6fc3-49eb-98c9-300541a267dc\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") "
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.730646 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kp76\" (UniqueName: \"kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76\") pod \"9810e031-6fc3-49eb-98c9-300541a267dc\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") "
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.730733 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities\") pod \"9810e031-6fc3-49eb-98c9-300541a267dc\" (UID: \"9810e031-6fc3-49eb-98c9-300541a267dc\") "
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.731473 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities" (OuterVolumeSpecName: "utilities") pod "9810e031-6fc3-49eb-98c9-300541a267dc" (UID: "9810e031-6fc3-49eb-98c9-300541a267dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.735958 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76" (OuterVolumeSpecName: "kube-api-access-8kp76") pod "9810e031-6fc3-49eb-98c9-300541a267dc" (UID: "9810e031-6fc3-49eb-98c9-300541a267dc"). InnerVolumeSpecName "kube-api-access-8kp76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.777499 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9810e031-6fc3-49eb-98c9-300541a267dc" (UID: "9810e031-6fc3-49eb-98c9-300541a267dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.820588 4724 generic.go:334] "Generic (PLEG): container finished" podID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerID="fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa" exitCode=0
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.820655 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerDied","Data":"fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa"}
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.820967 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerStarted","Data":"7630cdece05f858c5e0fd24159529d069d9e47e62f5e078fe01bfd0bbe8688bb"}
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.823357 4724 generic.go:334] "Generic (PLEG): container finished" podID="9810e031-6fc3-49eb-98c9-300541a267dc" containerID="c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a" exitCode=0
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.823399 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerDied","Data":"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"}
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.823430 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5nzcv" event={"ID":"9810e031-6fc3-49eb-98c9-300541a267dc","Type":"ContainerDied","Data":"1fe7de36714883b454cbe1f1cd729aa4581bfc8385601541d10484cdee115eee"}
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.823455 4724 scope.go:117] "RemoveContainer" containerID="c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.823443 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5nzcv"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.832579 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.832616 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9810e031-6fc3-49eb-98c9-300541a267dc-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.832631 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kp76\" (UniqueName: \"kubernetes.io/projected/9810e031-6fc3-49eb-98c9-300541a267dc-kube-api-access-8kp76\") on node \"crc\" DevicePath \"\""
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.845599 4724 scope.go:117] "RemoveContainer" containerID="99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.868172 4724 scope.go:117] "RemoveContainer" containerID="40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.871918 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5nzcv"]
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.877077 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5nzcv"]
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.884525 4724 scope.go:117] "RemoveContainer" containerID="c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"
Oct 02 13:12:10 crc kubenswrapper[4724]: E1002 13:12:10.885954 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a\": container with ID starting with c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a not found: ID does not exist" containerID="c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.886080 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a"} err="failed to get container status \"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a\": rpc error: code = NotFound desc = could not find container \"c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a\": container with ID starting with c4c60973572ccd17a92b100d69968eaf04ff7726b392e278e403eb989504bf2a not found: ID does not exist"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.886162 4724 scope.go:117] "RemoveContainer" containerID="99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec"
Oct 02 13:12:10 crc kubenswrapper[4724]: E1002 13:12:10.886506 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec\": container with ID starting with 99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec not found: ID does not exist" containerID="99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.886567 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec"} err="failed to get container status \"99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec\": rpc error: code = NotFound desc = could not find container \"99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec\": container with ID starting with 99aa9bf85086dedd44ac61e67f5ab107f015efbf9ae0fee45079e598cb902fec not found: ID does not exist"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.886597 4724 scope.go:117] "RemoveContainer" containerID="40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83"
Oct 02 13:12:10 crc kubenswrapper[4724]: E1002 13:12:10.886906 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83\": container with ID starting with 40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83 not found: ID does not exist" containerID="40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83"
Oct 02 13:12:10 crc kubenswrapper[4724]: I1002 13:12:10.886926 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83"} err="failed to get container status \"40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83\": rpc error: code = NotFound desc = could not find container \"40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83\": container with ID starting with 40bb92dedb9e588a4a5c6d93940219c264f5fbaed571782918880bf59c667f83 not found: ID does not exist"
Oct 02 13:12:11 crc kubenswrapper[4724]: I1002 13:12:11.829657 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerStarted","Data":"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22"}
Oct 02 13:12:12 crc kubenswrapper[4724]: I1002 13:12:12.330619 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" path="/var/lib/kubelet/pods/9810e031-6fc3-49eb-98c9-300541a267dc/volumes"
Oct 02 13:12:12 crc kubenswrapper[4724]: I1002 13:12:12.837617 4724 generic.go:334] "Generic (PLEG): container finished" podID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerID="fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22" exitCode=0
Oct 02 13:12:12 crc kubenswrapper[4724]: I1002 13:12:12.837669 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerDied","Data":"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22"}
Oct 02 13:12:13 crc kubenswrapper[4724]: I1002 13:12:13.847529 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerStarted","Data":"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3"}
Oct 02 13:12:13 crc kubenswrapper[4724]: I1002 13:12:13.878652 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-spbdk" podStartSLOduration=2.427946392 podStartE2EDuration="4.878629175s" podCreationTimestamp="2025-10-02 13:12:09 +0000 UTC" firstStartedPulling="2025-10-02 13:12:10.831519957 +0000 UTC m=+795.286279078" lastFinishedPulling="2025-10-02 13:12:13.28220272 +0000 UTC m=+797.736961861" observedRunningTime="2025-10-02 13:12:13.873922833 +0000 UTC m=+798.328681954" watchObservedRunningTime="2025-10-02 13:12:13.878629175 +0000 UTC m=+798.333388296"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.854329 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:17 crc kubenswrapper[4724]: E1002 13:12:17.855444 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="registry-server"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.855480 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="registry-server"
Oct 02 13:12:17 crc kubenswrapper[4724]: E1002 13:12:17.855526 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="extract-content"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.855587 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="extract-content"
Oct 02 13:12:17 crc kubenswrapper[4724]: E1002 13:12:17.855612 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="extract-utilities"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.855632 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="extract-utilities"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.855907 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="9810e031-6fc3-49eb-98c9-300541a267dc" containerName="registry-server"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.856971 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.869349 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-5pfkq"
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.884795 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:17 crc kubenswrapper[4724]: I1002 13:12:17.925451 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fnnn\" (UniqueName: \"kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn\") pod \"infra-operator-index-8npp6\" (UID: \"3c19afae-cac6-4547-bdde-09217fbd153a\") " pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:18 crc kubenswrapper[4724]: I1002 13:12:18.026643 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fnnn\" (UniqueName: \"kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn\") pod \"infra-operator-index-8npp6\" (UID: \"3c19afae-cac6-4547-bdde-09217fbd153a\") " pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:18 crc kubenswrapper[4724]: I1002 13:12:18.071927 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fnnn\" (UniqueName: \"kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn\") pod \"infra-operator-index-8npp6\" (UID: \"3c19afae-cac6-4547-bdde-09217fbd153a\") " pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:18 crc kubenswrapper[4724]: I1002 13:12:18.198036 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:18 crc kubenswrapper[4724]: I1002 13:12:18.605111 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:18 crc kubenswrapper[4724]: I1002 13:12:18.876706 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8npp6" event={"ID":"3c19afae-cac6-4547-bdde-09217fbd153a","Type":"ContainerStarted","Data":"be5992c913514be35c299baeffbd55b8e606825aefca468e5f0ae7fa42c8881c"}
Oct 02 13:12:19 crc kubenswrapper[4724]: I1002 13:12:19.566678 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:19 crc kubenswrapper[4724]: I1002 13:12:19.566732 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:19 crc kubenswrapper[4724]: I1002 13:12:19.615786 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:19 crc kubenswrapper[4724]: I1002 13:12:19.919183 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:20 crc kubenswrapper[4724]: I1002 13:12:20.891106 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8npp6" event={"ID":"3c19afae-cac6-4547-bdde-09217fbd153a","Type":"ContainerStarted","Data":"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"}
Oct 02 13:12:20 crc kubenswrapper[4724]: I1002 13:12:20.913354 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-8npp6" podStartSLOduration=2.073215834 podStartE2EDuration="3.913331224s" podCreationTimestamp="2025-10-02 13:12:17 +0000 UTC" firstStartedPulling="2025-10-02 13:12:18.614691854 +0000 UTC m=+803.069450975" lastFinishedPulling="2025-10-02 13:12:20.454807244 +0000 UTC m=+804.909566365" observedRunningTime="2025-10-02 13:12:20.911398424 +0000 UTC m=+805.366157565" watchObservedRunningTime="2025-10-02 13:12:20.913331224 +0000 UTC m=+805.368090365"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.232517 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.232996 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-8npp6" podUID="3c19afae-cac6-4547-bdde-09217fbd153a" containerName="registry-server" containerID="cri-o://50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc" gracePeriod=2
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.590220 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.702461 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fnnn\" (UniqueName: \"kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn\") pod \"3c19afae-cac6-4547-bdde-09217fbd153a\" (UID: \"3c19afae-cac6-4547-bdde-09217fbd153a\") "
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.707830 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn" (OuterVolumeSpecName: "kube-api-access-2fnnn") pod "3c19afae-cac6-4547-bdde-09217fbd153a" (UID: "3c19afae-cac6-4547-bdde-09217fbd153a"). InnerVolumeSpecName "kube-api-access-2fnnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.804441 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fnnn\" (UniqueName: \"kubernetes.io/projected/3c19afae-cac6-4547-bdde-09217fbd153a-kube-api-access-2fnnn\") on node \"crc\" DevicePath \"\""
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.910955 4724 generic.go:334] "Generic (PLEG): container finished" podID="3c19afae-cac6-4547-bdde-09217fbd153a" containerID="50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc" exitCode=0
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.911007 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-8npp6"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.911029 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8npp6" event={"ID":"3c19afae-cac6-4547-bdde-09217fbd153a","Type":"ContainerDied","Data":"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"}
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.911468 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-8npp6" event={"ID":"3c19afae-cac6-4547-bdde-09217fbd153a","Type":"ContainerDied","Data":"be5992c913514be35c299baeffbd55b8e606825aefca468e5f0ae7fa42c8881c"}
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.911501 4724 scope.go:117] "RemoveContainer" containerID="50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.935980 4724 scope.go:117] "RemoveContainer" containerID="50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"
Oct 02 13:12:23 crc kubenswrapper[4724]: E1002 13:12:23.936703 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc\": container with ID starting with 50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc not found: ID does not exist" containerID="50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.936754 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc"} err="failed to get container status \"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc\": rpc error: code = NotFound desc = could not find container \"50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc\": container with ID starting with 50049b76189a10ebeae87dd511bfc705dd680f505ebfd62ff52581ec2abee1bc not found: ID does not exist"
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.955351 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:23 crc kubenswrapper[4724]: I1002 13:12:23.958298 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-8npp6"]
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.037747 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-5h4lj"]
Oct 02 13:12:24 crc kubenswrapper[4724]: E1002 13:12:24.037994 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19afae-cac6-4547-bdde-09217fbd153a" containerName="registry-server"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.038014 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19afae-cac6-4547-bdde-09217fbd153a" containerName="registry-server"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.038134 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c19afae-cac6-4547-bdde-09217fbd153a" containerName="registry-server"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.038500 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5h4lj"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.041102 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-5pfkq"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.051329 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5h4lj"]
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.108082 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfw8\" (UniqueName: \"kubernetes.io/projected/e19f04a2-01f2-43d1-b30c-9a0b2e9662b1-kube-api-access-stfw8\") pod \"infra-operator-index-5h4lj\" (UID: \"e19f04a2-01f2-43d1-b30c-9a0b2e9662b1\") " pod="openstack-operators/infra-operator-index-5h4lj"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.209671 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfw8\" (UniqueName: \"kubernetes.io/projected/e19f04a2-01f2-43d1-b30c-9a0b2e9662b1-kube-api-access-stfw8\") pod \"infra-operator-index-5h4lj\" (UID: \"e19f04a2-01f2-43d1-b30c-9a0b2e9662b1\") " pod="openstack-operators/infra-operator-index-5h4lj"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.225822 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfw8\" (UniqueName: \"kubernetes.io/projected/e19f04a2-01f2-43d1-b30c-9a0b2e9662b1-kube-api-access-stfw8\") pod \"infra-operator-index-5h4lj\" (UID: \"e19f04a2-01f2-43d1-b30c-9a0b2e9662b1\") " pod="openstack-operators/infra-operator-index-5h4lj"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.319863 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c19afae-cac6-4547-bdde-09217fbd153a" path="/var/lib/kubelet/pods/3c19afae-cac6-4547-bdde-09217fbd153a/volumes"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.355194 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5h4lj"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.432002 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"]
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.432217 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-spbdk" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="registry-server" containerID="cri-o://a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3" gracePeriod=2
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.777799 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5h4lj"]
Oct 02 13:12:24 crc kubenswrapper[4724]: W1002 13:12:24.785643 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19f04a2_01f2_43d1_b30c_9a0b2e9662b1.slice/crio-a113bccc9607b2d1f22f65fde63d221a1a3cb38a0e0b8afa1e7fd3167a1d8431 WatchSource:0}: Error finding container a113bccc9607b2d1f22f65fde63d221a1a3cb38a0e0b8afa1e7fd3167a1d8431: Status 404 returned error can't find the container with id a113bccc9607b2d1f22f65fde63d221a1a3cb38a0e0b8afa1e7fd3167a1d8431
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.848395 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-spbdk"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.917771 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content\") pod \"a0f88112-a644-4c18-a7a9-0b52131640d5\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") "
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.917826 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities\") pod \"a0f88112-a644-4c18-a7a9-0b52131640d5\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") "
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.917938 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd89z\" (UniqueName: \"kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z\") pod \"a0f88112-a644-4c18-a7a9-0b52131640d5\" (UID: \"a0f88112-a644-4c18-a7a9-0b52131640d5\") "
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.918943 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities" (OuterVolumeSpecName: "utilities") pod "a0f88112-a644-4c18-a7a9-0b52131640d5" (UID: "a0f88112-a644-4c18-a7a9-0b52131640d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.919141 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5h4lj" event={"ID":"e19f04a2-01f2-43d1-b30c-9a0b2e9662b1","Type":"ContainerStarted","Data":"a113bccc9607b2d1f22f65fde63d221a1a3cb38a0e0b8afa1e7fd3167a1d8431"}
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.923037 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z" (OuterVolumeSpecName: "kube-api-access-sd89z") pod "a0f88112-a644-4c18-a7a9-0b52131640d5" (UID: "a0f88112-a644-4c18-a7a9-0b52131640d5"). InnerVolumeSpecName "kube-api-access-sd89z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.925119 4724 generic.go:334] "Generic (PLEG): container finished" podID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerID="a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3" exitCode=0
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.925159 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerDied","Data":"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3"}
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.925189 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-spbdk" event={"ID":"a0f88112-a644-4c18-a7a9-0b52131640d5","Type":"ContainerDied","Data":"7630cdece05f858c5e0fd24159529d069d9e47e62f5e078fe01bfd0bbe8688bb"}
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.925210 4724 scope.go:117] "RemoveContainer" containerID="a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3"
Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.925259 4724 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-spbdk" Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.944960 4724 scope.go:117] "RemoveContainer" containerID="fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22" Oct 02 13:12:24 crc kubenswrapper[4724]: I1002 13:12:24.993418 4724 scope.go:117] "RemoveContainer" containerID="fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.015850 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f88112-a644-4c18-a7a9-0b52131640d5" (UID: "a0f88112-a644-4c18-a7a9-0b52131640d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.020266 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd89z\" (UniqueName: \"kubernetes.io/projected/a0f88112-a644-4c18-a7a9-0b52131640d5-kube-api-access-sd89z\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.020310 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.020320 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f88112-a644-4c18-a7a9-0b52131640d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.027946 4724 scope.go:117] "RemoveContainer" containerID="a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3" Oct 02 13:12:25 crc kubenswrapper[4724]: E1002 13:12:25.028476 4724 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3\": container with ID starting with a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3 not found: ID does not exist" containerID="a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.028514 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3"} err="failed to get container status \"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3\": rpc error: code = NotFound desc = could not find container \"a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3\": container with ID starting with a32939db8d47a8740d6caaa10a202f798c33386d74b8a5f5ae3bc79cd4ae44c3 not found: ID does not exist" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.028587 4724 scope.go:117] "RemoveContainer" containerID="fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22" Oct 02 13:12:25 crc kubenswrapper[4724]: E1002 13:12:25.028853 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22\": container with ID starting with fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22 not found: ID does not exist" containerID="fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.028877 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22"} err="failed to get container status \"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22\": rpc error: code = NotFound desc = could not find 
container \"fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22\": container with ID starting with fa9808f3c53a15d61c987fd796f71a6722ecfba3935f9fbca3b8280ebbd47e22 not found: ID does not exist" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.028893 4724 scope.go:117] "RemoveContainer" containerID="fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa" Oct 02 13:12:25 crc kubenswrapper[4724]: E1002 13:12:25.029118 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa\": container with ID starting with fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa not found: ID does not exist" containerID="fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.029145 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa"} err="failed to get container status \"fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa\": rpc error: code = NotFound desc = could not find container \"fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa\": container with ID starting with fc019c31d2c7528ea0b7bdf0b5c05a42d51652e49429f77c5af2f7de6ce6b4aa not found: ID does not exist" Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.255253 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"] Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.258503 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-spbdk"] Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.934127 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5h4lj" 
event={"ID":"e19f04a2-01f2-43d1-b30c-9a0b2e9662b1","Type":"ContainerStarted","Data":"f752fe2310d140b91014c07a5e8c01a1b5ebffde1269446a88e968f4d1eb9f74"} Oct 02 13:12:25 crc kubenswrapper[4724]: I1002 13:12:25.954502 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-5h4lj" podStartSLOduration=1.370622167 podStartE2EDuration="1.954479795s" podCreationTimestamp="2025-10-02 13:12:24 +0000 UTC" firstStartedPulling="2025-10-02 13:12:24.78871169 +0000 UTC m=+809.243470811" lastFinishedPulling="2025-10-02 13:12:25.372569318 +0000 UTC m=+809.827328439" observedRunningTime="2025-10-02 13:12:25.952156524 +0000 UTC m=+810.406915645" watchObservedRunningTime="2025-10-02 13:12:25.954479795 +0000 UTC m=+810.409238916" Oct 02 13:12:26 crc kubenswrapper[4724]: I1002 13:12:26.320627 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" path="/var/lib/kubelet/pods/a0f88112-a644-4c18-a7a9-0b52131640d5/volumes" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.355563 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-5h4lj" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.356115 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-5h4lj" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.391502 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-5h4lj" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.734235 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 
13:12:34.734327 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.734392 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.735137 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.735206 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1" gracePeriod=600 Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.982138 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1" exitCode=0 Oct 02 13:12:34 crc kubenswrapper[4724]: I1002 13:12:34.982324 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1"} Oct 02 13:12:34 crc 
kubenswrapper[4724]: I1002 13:12:34.982670 4724 scope.go:117] "RemoveContainer" containerID="d398edf87fc8194d9c33ea4b32752e88bb2fb64c569afeb27d5aad73c63d4d57" Oct 02 13:12:35 crc kubenswrapper[4724]: I1002 13:12:35.005050 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-5h4lj" Oct 02 13:12:35 crc kubenswrapper[4724]: I1002 13:12:35.991125 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795"} Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.073490 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2"] Oct 02 13:12:36 crc kubenswrapper[4724]: E1002 13:12:36.073755 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="registry-server" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.073769 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="registry-server" Oct 02 13:12:36 crc kubenswrapper[4724]: E1002 13:12:36.073782 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="extract-content" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.073789 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="extract-content" Oct 02 13:12:36 crc kubenswrapper[4724]: E1002 13:12:36.073808 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="extract-utilities" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.073814 4724 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="extract-utilities" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.073915 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f88112-a644-4c18-a7a9-0b52131640d5" containerName="registry-server" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.074679 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.076310 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.083696 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2"] Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.268048 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.268368 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.268403 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-t6kpl\" (UniqueName: \"kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.369319 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.369370 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.369391 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6kpl\" (UniqueName: \"kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.370043 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle\") pod 
\"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.370046 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.399569 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6kpl\" (UniqueName: \"kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl\") pod \"ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:36 crc kubenswrapper[4724]: I1002 13:12:36.690734 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:37 crc kubenswrapper[4724]: I1002 13:12:37.082845 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2"] Oct 02 13:12:37 crc kubenswrapper[4724]: W1002 13:12:37.087405 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61201eac_406d_4bed_a59f_aa4fe87eebca.slice/crio-1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17 WatchSource:0}: Error finding container 1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17: Status 404 returned error can't find the container with id 1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17 Oct 02 13:12:38 crc kubenswrapper[4724]: I1002 13:12:38.002392 4724 generic.go:334] "Generic (PLEG): container finished" podID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerID="b6e3912a94316d23897a4ed97f91e0a2b55345aa773ca1a7f472c3e699365191" exitCode=0 Oct 02 13:12:38 crc kubenswrapper[4724]: I1002 13:12:38.002485 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" event={"ID":"61201eac-406d-4bed-a59f-aa4fe87eebca","Type":"ContainerDied","Data":"b6e3912a94316d23897a4ed97f91e0a2b55345aa773ca1a7f472c3e699365191"} Oct 02 13:12:38 crc kubenswrapper[4724]: I1002 13:12:38.002810 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" event={"ID":"61201eac-406d-4bed-a59f-aa4fe87eebca","Type":"ContainerStarted","Data":"1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17"} Oct 02 13:12:39 crc kubenswrapper[4724]: I1002 13:12:39.009498 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerID="b3afe64dcae51fbf8c688cc381f51f8d0067e5ba214af2817796182d5dabe055" exitCode=0 Oct 02 13:12:39 crc kubenswrapper[4724]: I1002 13:12:39.009782 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" event={"ID":"61201eac-406d-4bed-a59f-aa4fe87eebca","Type":"ContainerDied","Data":"b3afe64dcae51fbf8c688cc381f51f8d0067e5ba214af2817796182d5dabe055"} Oct 02 13:12:40 crc kubenswrapper[4724]: I1002 13:12:40.018132 4724 generic.go:334] "Generic (PLEG): container finished" podID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerID="ea952f1dbe76e4f67a5ba1f71d9f7d5fe44ada202a4f69ceb26b9f0108029f4d" exitCode=0 Oct 02 13:12:40 crc kubenswrapper[4724]: I1002 13:12:40.018224 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" event={"ID":"61201eac-406d-4bed-a59f-aa4fe87eebca","Type":"ContainerDied","Data":"ea952f1dbe76e4f67a5ba1f71d9f7d5fe44ada202a4f69ceb26b9f0108029f4d"} Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.285990 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.433171 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle\") pod \"61201eac-406d-4bed-a59f-aa4fe87eebca\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.433275 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6kpl\" (UniqueName: \"kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl\") pod \"61201eac-406d-4bed-a59f-aa4fe87eebca\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.433333 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util\") pod \"61201eac-406d-4bed-a59f-aa4fe87eebca\" (UID: \"61201eac-406d-4bed-a59f-aa4fe87eebca\") " Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.434314 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle" (OuterVolumeSpecName: "bundle") pod "61201eac-406d-4bed-a59f-aa4fe87eebca" (UID: "61201eac-406d-4bed-a59f-aa4fe87eebca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.441430 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl" (OuterVolumeSpecName: "kube-api-access-t6kpl") pod "61201eac-406d-4bed-a59f-aa4fe87eebca" (UID: "61201eac-406d-4bed-a59f-aa4fe87eebca"). InnerVolumeSpecName "kube-api-access-t6kpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.448614 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util" (OuterVolumeSpecName: "util") pod "61201eac-406d-4bed-a59f-aa4fe87eebca" (UID: "61201eac-406d-4bed-a59f-aa4fe87eebca"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.535315 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.535364 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6kpl\" (UniqueName: \"kubernetes.io/projected/61201eac-406d-4bed-a59f-aa4fe87eebca-kube-api-access-t6kpl\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:41 crc kubenswrapper[4724]: I1002 13:12:41.535374 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61201eac-406d-4bed-a59f-aa4fe87eebca-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:42 crc kubenswrapper[4724]: I1002 13:12:42.034667 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" event={"ID":"61201eac-406d-4bed-a59f-aa4fe87eebca","Type":"ContainerDied","Data":"1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17"} Oct 02 13:12:42 crc kubenswrapper[4724]: I1002 13:12:42.035006 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1389a97604d3052f02aa4c1c25c739c9fd9877145566abdda178595b78345b17" Oct 02 13:12:42 crc kubenswrapper[4724]: I1002 13:12:42.034708 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.387585 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj"] Oct 02 13:12:49 crc kubenswrapper[4724]: E1002 13:12:49.388410 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="pull" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.388430 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="pull" Oct 02 13:12:49 crc kubenswrapper[4724]: E1002 13:12:49.388445 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="extract" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.388453 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="extract" Oct 02 13:12:49 crc kubenswrapper[4724]: E1002 13:12:49.388472 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="util" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.388481 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="util" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.388648 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="61201eac-406d-4bed-a59f-aa4fe87eebca" containerName="extract" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.389426 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.391434 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.391477 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9pj5x" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.410987 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj"] Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.558036 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-webhook-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.558088 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf92\" (UniqueName: \"kubernetes.io/projected/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-kube-api-access-xbf92\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.558127 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-apiservice-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: 
\"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.658992 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-webhook-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.659066 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbf92\" (UniqueName: \"kubernetes.io/projected/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-kube-api-access-xbf92\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.659112 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-apiservice-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.666032 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-apiservice-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.670215 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-webhook-cert\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.678653 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf92\" (UniqueName: \"kubernetes.io/projected/d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e-kube-api-access-xbf92\") pod \"infra-operator-controller-manager-75d774d5cf-hbbvj\" (UID: \"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e\") " pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:49 crc kubenswrapper[4724]: I1002 13:12:49.710062 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:50 crc kubenswrapper[4724]: I1002 13:12:50.200965 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj"] Oct 02 13:12:50 crc kubenswrapper[4724]: W1002 13:12:50.228134 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f7ea4a_c39f_4e03_bb32_0ef1cd9f322e.slice/crio-4063dd2825cf5f9f3c6a8d6982f71110668579955a8bcf32b2f1ef3aa38a9d69 WatchSource:0}: Error finding container 4063dd2825cf5f9f3c6a8d6982f71110668579955a8bcf32b2f1ef3aa38a9d69: Status 404 returned error can't find the container with id 4063dd2825cf5f9f3c6a8d6982f71110668579955a8bcf32b2f1ef3aa38a9d69 Oct 02 13:12:51 crc kubenswrapper[4724]: I1002 13:12:51.095322 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" 
event={"ID":"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e","Type":"ContainerStarted","Data":"4063dd2825cf5f9f3c6a8d6982f71110668579955a8bcf32b2f1ef3aa38a9d69"} Oct 02 13:12:53 crc kubenswrapper[4724]: I1002 13:12:53.108453 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" event={"ID":"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e","Type":"ContainerStarted","Data":"baa5b7756add36b003c7cdd829aa4cb58289868a8de087f2df4f41782f63aa5d"} Oct 02 13:12:53 crc kubenswrapper[4724]: I1002 13:12:53.109008 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" event={"ID":"d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e","Type":"ContainerStarted","Data":"589d55deb3b3fc80dd88584c821542a00e61b067e4da3f4cf81a7d7e727a1bc2"} Oct 02 13:12:53 crc kubenswrapper[4724]: I1002 13:12:53.109031 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" Oct 02 13:12:53 crc kubenswrapper[4724]: I1002 13:12:53.126447 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj" podStartSLOduration=2.138160419 podStartE2EDuration="4.12643278s" podCreationTimestamp="2025-10-02 13:12:49 +0000 UTC" firstStartedPulling="2025-10-02 13:12:50.233334796 +0000 UTC m=+834.688093927" lastFinishedPulling="2025-10-02 13:12:52.221607167 +0000 UTC m=+836.676366288" observedRunningTime="2025-10-02 13:12:53.124452828 +0000 UTC m=+837.579211949" watchObservedRunningTime="2025-10-02 13:12:53.12643278 +0000 UTC m=+837.581191901" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.801082 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.802345 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804185 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804206 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804691 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-p2jsj" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804817 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804858 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.804944 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.817495 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.831776 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.833006 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.837876 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.839026 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.845974 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.854097 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937381 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937425 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tpb\" (UniqueName: \"kubernetes.io/projected/fa83e829-3df9-425f-91ba-2271f0c201ab-kube-api-access-f7tpb\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937457 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937474 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfstm\" (UniqueName: \"kubernetes.io/projected/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kube-api-access-kfstm\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" 
Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937497 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937552 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dae47a84-6fb7-42c6-ad5b-415ca57924b3-secrets\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937682 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937762 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937810 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kolla-config\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 
13:12:55.937827 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937878 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa83e829-3df9-425f-91ba-2271f0c201ab-secrets\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937896 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937927 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:55 crc kubenswrapper[4724]: I1002 13:12:55.937993 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-default\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 
13:12:56.039940 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-operator-scripts\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040009 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040038 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-default\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040066 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-default\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040103 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2d30088c-495e-4cd7-892b-f33848b4d5be-secrets\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040124 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgnzs\" (UniqueName: \"kubernetes.io/projected/2d30088c-495e-4cd7-892b-f33848b4d5be-kube-api-access-pgnzs\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040158 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-generated\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040181 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040202 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tpb\" (UniqueName: \"kubernetes.io/projected/fa83e829-3df9-425f-91ba-2271f0c201ab-kube-api-access-f7tpb\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040234 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfstm\" (UniqueName: \"kubernetes.io/projected/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kube-api-access-kfstm\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040258 4724 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040289 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040318 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dae47a84-6fb7-42c6-ad5b-415ca57924b3-secrets\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040338 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040361 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040381 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040409 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-kolla-config\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040429 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kolla-config\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040454 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040480 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa83e829-3df9-425f-91ba-2271f0c201ab-secrets\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040484 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-generated\") pod \"openstack-galera-2\" (UID: 
\"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040502 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040684 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.040744 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.041449 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-config-data-default\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.041699 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-default\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " 
pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.042469 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-kolla-config\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.042485 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kolla-config\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.042549 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa83e829-3df9-425f-91ba-2271f0c201ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.042819 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa83e829-3df9-425f-91ba-2271f0c201ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.043671 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae47a84-6fb7-42c6-ad5b-415ca57924b3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.049456 4724 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fa83e829-3df9-425f-91ba-2271f0c201ab-secrets\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.060387 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.062830 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dae47a84-6fb7-42c6-ad5b-415ca57924b3-secrets\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.063018 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.063490 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfstm\" (UniqueName: \"kubernetes.io/projected/dae47a84-6fb7-42c6-ad5b-415ca57924b3-kube-api-access-kfstm\") pod \"openstack-galera-2\" (UID: \"dae47a84-6fb7-42c6-ad5b-415ca57924b3\") " pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.067059 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tpb\" (UniqueName: \"kubernetes.io/projected/fa83e829-3df9-425f-91ba-2271f0c201ab-kube-api-access-f7tpb\") pod 
\"openstack-galera-0\" (UID: \"fa83e829-3df9-425f-91ba-2271f0c201ab\") " pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.141879 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.141948 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-kolla-config\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.141986 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-operator-scripts\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.142012 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-default\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.142035 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2d30088c-495e-4cd7-892b-f33848b4d5be-secrets\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc 
kubenswrapper[4724]: I1002 13:12:56.142050 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgnzs\" (UniqueName: \"kubernetes.io/projected/2d30088c-495e-4cd7-892b-f33848b4d5be-kube-api-access-pgnzs\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.142071 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-generated\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.142087 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.142958 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-generated\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.143039 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-config-data-default\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.143307 4724 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-kolla-config\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.143452 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d30088c-495e-4cd7-892b-f33848b4d5be-operator-scripts\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.144999 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2d30088c-495e-4cd7-892b-f33848b4d5be-secrets\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.158475 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.160471 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgnzs\" (UniqueName: \"kubernetes.io/projected/2d30088c-495e-4cd7-892b-f33848b4d5be-kube-api-access-pgnzs\") pod \"openstack-galera-1\" (UID: \"2d30088c-495e-4cd7-892b-f33848b4d5be\") " pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.169715 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.178365 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.188745 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.582797 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Oct 02 13:12:56 crc kubenswrapper[4724]: W1002 13:12:56.586307 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae47a84_6fb7_42c6_ad5b_415ca57924b3.slice/crio-4170e9b309dadfa094f62ca7f19c0991f68f58ac8a89725f1ee2d78829cba540 WatchSource:0}: Error finding container 4170e9b309dadfa094f62ca7f19c0991f68f58ac8a89725f1ee2d78829cba540: Status 404 returned error can't find the container with id 4170e9b309dadfa094f62ca7f19c0991f68f58ac8a89725f1ee2d78829cba540
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.587045 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Oct 02 13:12:56 crc kubenswrapper[4724]: W1002 13:12:56.587282 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa83e829_3df9_425f_91ba_2271f0c201ab.slice/crio-b086add25eedaae96fb602943617f2e01ff1fd2e3e08d22431d489347e3f5eed WatchSource:0}: Error finding container b086add25eedaae96fb602943617f2e01ff1fd2e3e08d22431d489347e3f5eed: Status 404 returned error can't find the container with id b086add25eedaae96fb602943617f2e01ff1fd2e3e08d22431d489347e3f5eed
Oct 02 13:12:56 crc kubenswrapper[4724]: I1002 13:12:56.637768 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Oct 02 13:12:56 crc kubenswrapper[4724]: W1002 13:12:56.649726 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d30088c_495e_4cd7_892b_f33848b4d5be.slice/crio-4973befdafb46cf21b1c8b8f09dd35a02abcc9070d16ac39c6a67b6c6de41065 WatchSource:0}: Error finding container 4973befdafb46cf21b1c8b8f09dd35a02abcc9070d16ac39c6a67b6c6de41065: Status 404 returned error can't find the container with id 4973befdafb46cf21b1c8b8f09dd35a02abcc9070d16ac39c6a67b6c6de41065
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.057127 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"]
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.058580 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.075752 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"]
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.133875 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"dae47a84-6fb7-42c6-ad5b-415ca57924b3","Type":"ContainerStarted","Data":"4170e9b309dadfa094f62ca7f19c0991f68f58ac8a89725f1ee2d78829cba540"}
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.135314 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"fa83e829-3df9-425f-91ba-2271f0c201ab","Type":"ContainerStarted","Data":"b086add25eedaae96fb602943617f2e01ff1fd2e3e08d22431d489347e3f5eed"}
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.136551 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"2d30088c-495e-4cd7-892b-f33848b4d5be","Type":"ContainerStarted","Data":"4973befdafb46cf21b1c8b8f09dd35a02abcc9070d16ac39c6a67b6c6de41065"}
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.158548 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.158627 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvdf7\" (UniqueName: \"kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.158672 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.260739 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvdf7\" (UniqueName: \"kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.260858 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.260939 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.261893 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.263754 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.293880 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvdf7\" (UniqueName: \"kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7\") pod \"certified-operators-dm9x4\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.407786 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:12:57 crc kubenswrapper[4724]: I1002 13:12:57.784942 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"]
Oct 02 13:12:58 crc kubenswrapper[4724]: I1002 13:12:58.147864 4724 generic.go:334] "Generic (PLEG): container finished" podID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerID="56fa8c718a5f92e7c90e27de47034a569e764ef994d9cfba075bef9dac044585" exitCode=0
Oct 02 13:12:58 crc kubenswrapper[4724]: I1002 13:12:58.148071 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerDied","Data":"56fa8c718a5f92e7c90e27de47034a569e764ef994d9cfba075bef9dac044585"}
Oct 02 13:12:58 crc kubenswrapper[4724]: I1002 13:12:58.148099 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerStarted","Data":"b358268f857176329a4c480668500e41b47c2ec3683e10247ec308a313ca5d5e"}
Oct 02 13:12:59 crc kubenswrapper[4724]: I1002 13:12:59.160557 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerStarted","Data":"7e93c2cfcb84bb97c5025d2a58a4b44cd9cfdba799ae4a883557fbf257a56818"}
Oct 02 13:12:59 crc kubenswrapper[4724]: I1002 13:12:59.715632 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-75d774d5cf-hbbvj"
Oct 02 13:13:00 crc kubenswrapper[4724]: I1002 13:13:00.170753 4724 generic.go:334] "Generic (PLEG): container finished" podID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerID="7e93c2cfcb84bb97c5025d2a58a4b44cd9cfdba799ae4a883557fbf257a56818" exitCode=0
Oct 02 13:13:00 crc kubenswrapper[4724]: I1002 13:13:00.170809 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerDied","Data":"7e93c2cfcb84bb97c5025d2a58a4b44cd9cfdba799ae4a883557fbf257a56818"}
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.045263 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"]
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.051563 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.058410 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"]
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.168771 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.168828 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.168900 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkhzx\" (UniqueName: \"kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.270231 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.270355 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.270470 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkhzx\" (UniqueName: \"kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.270684 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.270897 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.299910 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkhzx\" (UniqueName: \"kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx\") pod \"redhat-marketplace-zrj9q\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:03 crc kubenswrapper[4724]: I1002 13:13:03.371005 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.123314 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"]
Oct 02 13:13:05 crc kubenswrapper[4724]: W1002 13:13:05.127141 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabea5a78_a6fc_4a27_9595_c0ef2bbdcca9.slice/crio-53679ed16958458e98be784cb0a00e0a87c607d9f226e226af064a73555deb74 WatchSource:0}: Error finding container 53679ed16958458e98be784cb0a00e0a87c607d9f226e226af064a73555deb74: Status 404 returned error can't find the container with id 53679ed16958458e98be784cb0a00e0a87c607d9f226e226af064a73555deb74
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.203768 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerStarted","Data":"66d30c5a63a34cb9c71674fdbef9bf3ffca8d86eeb042428ff973e3cfc90218f"}
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.205073 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerStarted","Data":"53679ed16958458e98be784cb0a00e0a87c607d9f226e226af064a73555deb74"}
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.206864 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"2d30088c-495e-4cd7-892b-f33848b4d5be","Type":"ContainerStarted","Data":"34d4853528c84d77b8e29d3218cfa572f21b6bc6e4503b15557d9f7a7d8f4dcd"}
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.208913 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"dae47a84-6fb7-42c6-ad5b-415ca57924b3","Type":"ContainerStarted","Data":"d02f8fcf3844b7634bda8f6d743a247c5f0790d6294579e9aef3762c4928ea6b"}
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.210856 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"fa83e829-3df9-425f-91ba-2271f0c201ab","Type":"ContainerStarted","Data":"7af20e7e7a46aa1079f9c0245f34d1f6cfe916f87ca67155c7a16be16648aaf0"}
Oct 02 13:13:05 crc kubenswrapper[4724]: I1002 13:13:05.228998 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dm9x4" podStartSLOduration=1.556124569 podStartE2EDuration="8.228982862s" podCreationTimestamp="2025-10-02 13:12:57 +0000 UTC" firstStartedPulling="2025-10-02 13:12:58.149366509 +0000 UTC m=+842.604125630" lastFinishedPulling="2025-10-02 13:13:04.822224802 +0000 UTC m=+849.276983923" observedRunningTime="2025-10-02 13:13:05.227584525 +0000 UTC m=+849.682343656" watchObservedRunningTime="2025-10-02 13:13:05.228982862 +0000 UTC m=+849.683741983"
Oct 02 13:13:06 crc kubenswrapper[4724]: I1002 13:13:06.218746 4724 generic.go:334] "Generic (PLEG): container finished" podID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerID="c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7" exitCode=0
Oct 02 13:13:06 crc kubenswrapper[4724]: I1002 13:13:06.218820 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerDied","Data":"c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7"}
Oct 02 13:13:07 crc kubenswrapper[4724]: I1002 13:13:07.227643 4724 generic.go:334] "Generic (PLEG): container finished" podID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerID="36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632" exitCode=0
Oct 02 13:13:07 crc kubenswrapper[4724]: I1002 13:13:07.227759 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerDied","Data":"36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632"}
Oct 02 13:13:07 crc kubenswrapper[4724]: I1002 13:13:07.408870 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:13:07 crc kubenswrapper[4724]: I1002 13:13:07.408940 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:13:08 crc kubenswrapper[4724]: I1002 13:13:08.237148 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerStarted","Data":"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8"}
Oct 02 13:13:08 crc kubenswrapper[4724]: I1002 13:13:08.501388 4724 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dm9x4" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="registry-server" probeResult="failure" output=<
Oct 02 13:13:08 crc kubenswrapper[4724]: timeout: failed to connect service ":50051" within 1s
Oct 02 13:13:08 crc kubenswrapper[4724]: >
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.248167 4724 generic.go:334] "Generic (PLEG): container finished" podID="2d30088c-495e-4cd7-892b-f33848b4d5be" containerID="34d4853528c84d77b8e29d3218cfa572f21b6bc6e4503b15557d9f7a7d8f4dcd" exitCode=0
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.248249 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"2d30088c-495e-4cd7-892b-f33848b4d5be","Type":"ContainerDied","Data":"34d4853528c84d77b8e29d3218cfa572f21b6bc6e4503b15557d9f7a7d8f4dcd"}
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.252687 4724 generic.go:334] "Generic (PLEG): container finished" podID="dae47a84-6fb7-42c6-ad5b-415ca57924b3" containerID="d02f8fcf3844b7634bda8f6d743a247c5f0790d6294579e9aef3762c4928ea6b" exitCode=0
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.252782 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"dae47a84-6fb7-42c6-ad5b-415ca57924b3","Type":"ContainerDied","Data":"d02f8fcf3844b7634bda8f6d743a247c5f0790d6294579e9aef3762c4928ea6b"}
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.255768 4724 generic.go:334] "Generic (PLEG): container finished" podID="fa83e829-3df9-425f-91ba-2271f0c201ab" containerID="7af20e7e7a46aa1079f9c0245f34d1f6cfe916f87ca67155c7a16be16648aaf0" exitCode=0
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.255807 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"fa83e829-3df9-425f-91ba-2271f0c201ab","Type":"ContainerDied","Data":"7af20e7e7a46aa1079f9c0245f34d1f6cfe916f87ca67155c7a16be16648aaf0"}
Oct 02 13:13:10 crc kubenswrapper[4724]: I1002 13:13:10.279684 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrj9q" podStartSLOduration=5.757227842 podStartE2EDuration="7.279659991s" podCreationTimestamp="2025-10-02 13:13:03 +0000 UTC" firstStartedPulling="2025-10-02 13:13:06.221020376 +0000 UTC m=+850.675779497" lastFinishedPulling="2025-10-02 13:13:07.743452525 +0000 UTC m=+852.198211646" observedRunningTime="2025-10-02 13:13:08.257227541 +0000 UTC m=+852.711986662" watchObservedRunningTime="2025-10-02 13:13:10.279659991 +0000 UTC m=+854.734419132"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.042163 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jxtww"]
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.043237 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.048680 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-d8ppj"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.057412 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jxtww"]
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.100259 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9fl\" (UniqueName: \"kubernetes.io/projected/962102f1-5774-4ad9-988f-c1cd36d66caa-kube-api-access-jg9fl\") pod \"rabbitmq-cluster-operator-index-jxtww\" (UID: \"962102f1-5774-4ad9-988f-c1cd36d66caa\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.201848 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9fl\" (UniqueName: \"kubernetes.io/projected/962102f1-5774-4ad9-988f-c1cd36d66caa-kube-api-access-jg9fl\") pod \"rabbitmq-cluster-operator-index-jxtww\" (UID: \"962102f1-5774-4ad9-988f-c1cd36d66caa\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.221207 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9fl\" (UniqueName: \"kubernetes.io/projected/962102f1-5774-4ad9-988f-c1cd36d66caa-kube-api-access-jg9fl\") pod \"rabbitmq-cluster-operator-index-jxtww\" (UID: \"962102f1-5774-4ad9-988f-c1cd36d66caa\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.262553 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"dae47a84-6fb7-42c6-ad5b-415ca57924b3","Type":"ContainerStarted","Data":"4f5c9cf5a2e55b7523b7d768e96ab5099bf5e50fa50599fb37e87a1a1c5a8dbd"}
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.264311 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"fa83e829-3df9-425f-91ba-2271f0c201ab","Type":"ContainerStarted","Data":"9754696dd846bf9d81903eb8f29f557e9f572e3730a98d31f71c9e98b0b1b0f8"}
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.266063 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"2d30088c-495e-4cd7-892b-f33848b4d5be","Type":"ContainerStarted","Data":"b65cdef2ae23efde3dc53e0f3afe26869e0d2ba1426130449ed658f7f549879e"}
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.284771 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=9.35282793 podStartE2EDuration="17.28475075s" podCreationTimestamp="2025-10-02 13:12:54 +0000 UTC" firstStartedPulling="2025-10-02 13:12:56.588288887 +0000 UTC m=+841.043047998" lastFinishedPulling="2025-10-02 13:13:04.520211697 +0000 UTC m=+848.974970818" observedRunningTime="2025-10-02 13:13:11.279574544 +0000 UTC m=+855.734333675" watchObservedRunningTime="2025-10-02 13:13:11.28475075 +0000 UTC m=+855.739509871"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.303394 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=9.337664717 podStartE2EDuration="17.303378032s" podCreationTimestamp="2025-10-02 13:12:54 +0000 UTC" firstStartedPulling="2025-10-02 13:12:56.655263383 +0000 UTC m=+841.110022504" lastFinishedPulling="2025-10-02 13:13:04.620976698 +0000 UTC m=+849.075735819" observedRunningTime="2025-10-02 13:13:11.299586312 +0000 UTC m=+855.754345453" watchObservedRunningTime="2025-10-02 13:13:11.303378032 +0000 UTC m=+855.758137153"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.365861 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.800688 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=9.846688099 podStartE2EDuration="17.800671483s" podCreationTimestamp="2025-10-02 13:12:54 +0000 UTC" firstStartedPulling="2025-10-02 13:12:56.593843882 +0000 UTC m=+841.048603003" lastFinishedPulling="2025-10-02 13:13:04.547827266 +0000 UTC m=+849.002586387" observedRunningTime="2025-10-02 13:13:11.32263043 +0000 UTC m=+855.777389561" watchObservedRunningTime="2025-10-02 13:13:11.800671483 +0000 UTC m=+856.255430604"
Oct 02 13:13:11 crc kubenswrapper[4724]: I1002 13:13:11.803390 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jxtww"]
Oct 02 13:13:11 crc kubenswrapper[4724]: W1002 13:13:11.809868 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962102f1_5774_4ad9_988f_c1cd36d66caa.slice/crio-67dc0daef177ddaebc903d247a8ad1a3115d4ff47fb3c39d7ef9ea1696bb40f7 WatchSource:0}: Error finding container 67dc0daef177ddaebc903d247a8ad1a3115d4ff47fb3c39d7ef9ea1696bb40f7: Status 404 returned error can't find the container with id 67dc0daef177ddaebc903d247a8ad1a3115d4ff47fb3c39d7ef9ea1696bb40f7
Oct 02 13:13:12 crc kubenswrapper[4724]: I1002 13:13:12.275612 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" event={"ID":"962102f1-5774-4ad9-988f-c1cd36d66caa","Type":"ContainerStarted","Data":"67dc0daef177ddaebc903d247a8ad1a3115d4ff47fb3c39d7ef9ea1696bb40f7"}
Oct 02 13:13:13 crc kubenswrapper[4724]: I1002 13:13:13.371602 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:13 crc kubenswrapper[4724]: I1002 13:13:13.371968 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:13 crc kubenswrapper[4724]: I1002 13:13:13.425494 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:14 crc kubenswrapper[4724]: I1002 13:13:14.332370 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.018768 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"]
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.020238 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.026165 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.031798 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-xn8wt"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.037225 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"]
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.057245 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfck\" (UniqueName: \"kubernetes.io/projected/d97fb367-636d-4a88-ae9f-eca3e33182ac-kube-api-access-5mfck\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.057323 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-config-data\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.057385 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-kolla-config\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.159023 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfck\" (UniqueName: \"kubernetes.io/projected/d97fb367-636d-4a88-ae9f-eca3e33182ac-kube-api-access-5mfck\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.159422 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-config-data\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.159492 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-kolla-config\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.160729 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-config-data\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.160816 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97fb367-636d-4a88-ae9f-eca3e33182ac-kolla-config\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.184271 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfck\" (UniqueName: \"kubernetes.io/projected/d97fb367-636d-4a88-ae9f-eca3e33182ac-kube-api-access-5mfck\") pod \"memcached-0\" (UID: \"d97fb367-636d-4a88-ae9f-eca3e33182ac\") " pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:15 crc kubenswrapper[4724]: I1002 13:13:15.359818 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.170698 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.175098 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.182451 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.182816 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.189806 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.190664 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1"
Oct 02 13:13:16 crc kubenswrapper[4724]: I1002 13:13:16.392760 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"]
Oct 02 13:13:16 crc kubenswrapper[4724]: E1002 13:13:16.444053 4724 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.233:40124->38.102.83.233:33487: write tcp 38.102.83.233:40124->38.102.83.233:33487: write: broken pipe
Oct 02 13:13:17 crc kubenswrapper[4724]: I1002 13:13:17.306511 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"d97fb367-636d-4a88-ae9f-eca3e33182ac","Type":"ContainerStarted","Data":"1254d1a16acc6a66db2f0a99fa8deee7e372ec296bf2bf7e9278c01018534b92"}
Oct 02 13:13:17 crc kubenswrapper[4724]: I1002 13:13:17.465800 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:13:17 crc kubenswrapper[4724]: I1002 13:13:17.514693 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dm9x4"
Oct 02 13:13:18 crc kubenswrapper[4724]: I1002 13:13:18.321153 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" event={"ID":"962102f1-5774-4ad9-988f-c1cd36d66caa","Type":"ContainerStarted","Data":"8325eb7c397f7785cd828bfe78a0ad765c4e2e349753bd9fa93f3448194ca587"}
Oct 02 13:13:18 crc kubenswrapper[4724]: I1002 13:13:18.333477 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" podStartSLOduration=1.889474012 podStartE2EDuration="7.333453938s" podCreationTimestamp="2025-10-02 13:13:11 +0000 UTC" firstStartedPulling="2025-10-02 13:13:11.812044933 +0000 UTC m=+856.266804054" lastFinishedPulling="2025-10-02 13:13:17.256024859 +0000 UTC m=+861.710783980" observedRunningTime="2025-10-02 13:13:18.32857885 +0000 UTC m=+862.783337971" watchObservedRunningTime="2025-10-02 13:13:18.333453938 +0000 UTC m=+862.788213059"
Oct 02 13:13:18 crc kubenswrapper[4724]: I1002 13:13:18.834692 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"]
Oct 02 13:13:18 crc kubenswrapper[4724]: I1002 13:13:18.835233 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zrj9q" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="registry-server" containerID="cri-o://5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8" gracePeriod=2
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.269628 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrj9q"
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.321515 4724 generic.go:334] "Generic (PLEG): container finished" podID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerID="5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8" exitCode=0
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.321590 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerDied","Data":"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8"}
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.321616 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrj9q" event={"ID":"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9","Type":"ContainerDied","Data":"53679ed16958458e98be784cb0a00e0a87c607d9f226e226af064a73555deb74"}
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.321633 4724 scope.go:117] "RemoveContainer" containerID="5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8"
Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.321739 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrj9q" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.331054 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"d97fb367-636d-4a88-ae9f-eca3e33182ac","Type":"ContainerStarted","Data":"64eb6da337d5c414a5ec12b2c8e0c5c330e43051aa485c9f5fcc453d39de2888"} Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.331179 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.348817 4724 scope.go:117] "RemoveContainer" containerID="36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.364268 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=3.658538207 podStartE2EDuration="5.364244876s" podCreationTimestamp="2025-10-02 13:13:14 +0000 UTC" firstStartedPulling="2025-10-02 13:13:16.70880976 +0000 UTC m=+861.163568881" lastFinishedPulling="2025-10-02 13:13:18.414516429 +0000 UTC m=+862.869275550" observedRunningTime="2025-10-02 13:13:19.351656144 +0000 UTC m=+863.806415275" watchObservedRunningTime="2025-10-02 13:13:19.364244876 +0000 UTC m=+863.819003997" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.368093 4724 scope.go:117] "RemoveContainer" containerID="c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.392082 4724 scope.go:117] "RemoveContainer" containerID="5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8" Oct 02 13:13:19 crc kubenswrapper[4724]: E1002 13:13:19.392571 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8\": container with ID starting with 
5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8 not found: ID does not exist" containerID="5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.392626 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8"} err="failed to get container status \"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8\": rpc error: code = NotFound desc = could not find container \"5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8\": container with ID starting with 5e01d9a8e77d7926f2cb3599111c4ddab6c3ac7cd432e48e7e1747d48b37fda8 not found: ID does not exist" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.392658 4724 scope.go:117] "RemoveContainer" containerID="36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632" Oct 02 13:13:19 crc kubenswrapper[4724]: E1002 13:13:19.393177 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632\": container with ID starting with 36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632 not found: ID does not exist" containerID="36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.393217 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632"} err="failed to get container status \"36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632\": rpc error: code = NotFound desc = could not find container \"36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632\": container with ID starting with 36717aea5379077e4416f5833a52f24033aff91bb2dbc71540717ff8aa18f632 not found: ID does not 
exist" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.393241 4724 scope.go:117] "RemoveContainer" containerID="c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7" Oct 02 13:13:19 crc kubenswrapper[4724]: E1002 13:13:19.393570 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7\": container with ID starting with c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7 not found: ID does not exist" containerID="c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.393593 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7"} err="failed to get container status \"c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7\": rpc error: code = NotFound desc = could not find container \"c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7\": container with ID starting with c5c45be577012e6a10109beccdcb1ab3d17b48201678b1c31eeb6240b24665e7 not found: ID does not exist" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.418129 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content\") pod \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.418249 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities\") pod \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 
13:13:19.418328 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkhzx\" (UniqueName: \"kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx\") pod \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\" (UID: \"abea5a78-a6fc-4a27-9595-c0ef2bbdcca9\") " Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.420702 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities" (OuterVolumeSpecName: "utilities") pod "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" (UID: "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.426284 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx" (OuterVolumeSpecName: "kube-api-access-jkhzx") pod "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" (UID: "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9"). InnerVolumeSpecName "kube-api-access-jkhzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.431903 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" (UID: "abea5a78-a6fc-4a27-9595-c0ef2bbdcca9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.519745 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkhzx\" (UniqueName: \"kubernetes.io/projected/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-kube-api-access-jkhzx\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.519797 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.519810 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.646585 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"] Oct 02 13:13:19 crc kubenswrapper[4724]: I1002 13:13:19.650470 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrj9q"] Oct 02 13:13:20 crc kubenswrapper[4724]: I1002 13:13:20.245428 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:13:20 crc kubenswrapper[4724]: I1002 13:13:20.284876 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Oct 02 13:13:20 crc kubenswrapper[4724]: I1002 13:13:20.323601 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" path="/var/lib/kubelet/pods/abea5a78-a6fc-4a27-9595-c0ef2bbdcca9/volumes" Oct 02 13:13:21 crc kubenswrapper[4724]: I1002 13:13:21.366084 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" Oct 02 13:13:21 crc kubenswrapper[4724]: I1002 13:13:21.366129 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" Oct 02 13:13:21 crc kubenswrapper[4724]: I1002 13:13:21.395694 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.233903 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"] Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.234168 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dm9x4" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="registry-server" containerID="cri-o://66d30c5a63a34cb9c71674fdbef9bf3ffca8d86eeb042428ff973e3cfc90218f" gracePeriod=2 Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.354404 4724 generic.go:334] "Generic (PLEG): container finished" podID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerID="66d30c5a63a34cb9c71674fdbef9bf3ffca8d86eeb042428ff973e3cfc90218f" exitCode=0 Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.354480 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerDied","Data":"66d30c5a63a34cb9c71674fdbef9bf3ffca8d86eeb042428ff973e3cfc90218f"} Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.387081 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-jxtww" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.644495 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm9x4" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.771656 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content\") pod \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.771944 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvdf7\" (UniqueName: \"kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7\") pod \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.771995 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities\") pod \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\" (UID: \"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e\") " Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.773121 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities" (OuterVolumeSpecName: "utilities") pod "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" (UID: "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.778689 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7" (OuterVolumeSpecName: "kube-api-access-hvdf7") pod "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" (UID: "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e"). InnerVolumeSpecName "kube-api-access-hvdf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.822485 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" (UID: "63560a5f-9ae8-4fb0-8f06-4f5c1550c25e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.873383 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvdf7\" (UniqueName: \"kubernetes.io/projected/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-kube-api-access-hvdf7\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.873655 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:22 crc kubenswrapper[4724]: I1002 13:13:22.873737 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.363126 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dm9x4" Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.363445 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm9x4" event={"ID":"63560a5f-9ae8-4fb0-8f06-4f5c1550c25e","Type":"ContainerDied","Data":"b358268f857176329a4c480668500e41b47c2ec3683e10247ec308a313ca5d5e"} Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.363723 4724 scope.go:117] "RemoveContainer" containerID="66d30c5a63a34cb9c71674fdbef9bf3ffca8d86eeb042428ff973e3cfc90218f" Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.381954 4724 scope.go:117] "RemoveContainer" containerID="7e93c2cfcb84bb97c5025d2a58a4b44cd9cfdba799ae4a883557fbf257a56818" Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.404772 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"] Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.408090 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dm9x4"] Oct 02 13:13:23 crc kubenswrapper[4724]: I1002 13:13:23.420589 4724 scope.go:117] "RemoveContainer" containerID="56fa8c718a5f92e7c90e27de47034a569e764ef994d9cfba075bef9dac044585" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.342059 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" path="/var/lib/kubelet/pods/63560a5f-9ae8-4fb0-8f06-4f5c1550c25e/volumes" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696360 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b"] Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696652 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 
13:13:24.696666 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696683 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696691 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696712 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="extract-utilities" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696720 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="extract-utilities" Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696731 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="extract-content" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696738 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="extract-content" Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696753 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="extract-utilities" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696761 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="extract-utilities" Oct 02 13:13:24 crc kubenswrapper[4724]: E1002 13:13:24.696772 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="extract-content" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 
13:13:24.696780 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="extract-content" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696907 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="63560a5f-9ae8-4fb0-8f06-4f5c1550c25e" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.696921 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="abea5a78-a6fc-4a27-9595-c0ef2bbdcca9" containerName="registry-server" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.697881 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.699706 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.712069 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b"] Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.841026 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.841222 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: 
\"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.841261 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vdk\" (UniqueName: \"kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.942706 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.942759 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5vdk\" (UniqueName: \"kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.942814 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.943328 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.943349 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:24 crc kubenswrapper[4724]: I1002 13:13:24.961289 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vdk\" (UniqueName: \"kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:25 crc kubenswrapper[4724]: I1002 13:13:25.013447 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:25 crc kubenswrapper[4724]: I1002 13:13:25.362219 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Oct 02 13:13:25 crc kubenswrapper[4724]: I1002 13:13:25.472201 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b"] Oct 02 13:13:25 crc kubenswrapper[4724]: W1002 13:13:25.477093 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c83815e_5c93_45f4_9a44_48bb3e26f9c1.slice/crio-a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f WatchSource:0}: Error finding container a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f: Status 404 returned error can't find the container with id a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f Oct 02 13:13:26 crc kubenswrapper[4724]: I1002 13:13:26.395936 4724 generic.go:334] "Generic (PLEG): container finished" podID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerID="6b832e969f9f763d4aaca0962421195f5e18fd8d9687b31861d0dcaa66c7e9cd" exitCode=0 Oct 02 13:13:26 crc kubenswrapper[4724]: I1002 13:13:26.396442 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" event={"ID":"0c83815e-5c93-45f4-9a44-48bb3e26f9c1","Type":"ContainerDied","Data":"6b832e969f9f763d4aaca0962421195f5e18fd8d9687b31861d0dcaa66c7e9cd"} Oct 02 13:13:26 crc kubenswrapper[4724]: I1002 13:13:26.396469 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" 
event={"ID":"0c83815e-5c93-45f4-9a44-48bb3e26f9c1","Type":"ContainerStarted","Data":"a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f"} Oct 02 13:13:27 crc kubenswrapper[4724]: I1002 13:13:27.405449 4724 generic.go:334] "Generic (PLEG): container finished" podID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerID="ce65a412a39775fe7e65616fc48c0287baeb52b148b7fe369d6677c997d85263" exitCode=0 Oct 02 13:13:27 crc kubenswrapper[4724]: I1002 13:13:27.405571 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" event={"ID":"0c83815e-5c93-45f4-9a44-48bb3e26f9c1","Type":"ContainerDied","Data":"ce65a412a39775fe7e65616fc48c0287baeb52b148b7fe369d6677c997d85263"} Oct 02 13:13:28 crc kubenswrapper[4724]: I1002 13:13:28.415602 4724 generic.go:334] "Generic (PLEG): container finished" podID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerID="a909c77e4e4b37937db7de8bbecec3163b894f4ad945c1e60140a28dd16f8bca" exitCode=0 Oct 02 13:13:28 crc kubenswrapper[4724]: I1002 13:13:28.415635 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" event={"ID":"0c83815e-5c93-45f4-9a44-48bb3e26f9c1","Type":"ContainerDied","Data":"a909c77e4e4b37937db7de8bbecec3163b894f4ad945c1e60140a28dd16f8bca"} Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.710202 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.911199 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util\") pod \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.911297 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5vdk\" (UniqueName: \"kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk\") pod \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.911422 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle\") pod \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\" (UID: \"0c83815e-5c93-45f4-9a44-48bb3e26f9c1\") " Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.912344 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle" (OuterVolumeSpecName: "bundle") pod "0c83815e-5c93-45f4-9a44-48bb3e26f9c1" (UID: "0c83815e-5c93-45f4-9a44-48bb3e26f9c1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.920509 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk" (OuterVolumeSpecName: "kube-api-access-q5vdk") pod "0c83815e-5c93-45f4-9a44-48bb3e26f9c1" (UID: "0c83815e-5c93-45f4-9a44-48bb3e26f9c1"). InnerVolumeSpecName "kube-api-access-q5vdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:29 crc kubenswrapper[4724]: I1002 13:13:29.923071 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util" (OuterVolumeSpecName: "util") pod "0c83815e-5c93-45f4-9a44-48bb3e26f9c1" (UID: "0c83815e-5c93-45f4-9a44-48bb3e26f9c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.012515 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.012576 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.012591 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5vdk\" (UniqueName: \"kubernetes.io/projected/0c83815e-5c93-45f4-9a44-48bb3e26f9c1-kube-api-access-q5vdk\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.430594 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" event={"ID":"0c83815e-5c93-45f4-9a44-48bb3e26f9c1","Type":"ContainerDied","Data":"a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f"} Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.430677 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b" Oct 02 13:13:30 crc kubenswrapper[4724]: I1002 13:13:30.430632 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5b5aaeac5cd91cde0fa75098a0c435d5868747ca62eae486d55aa21ba2da65f" Oct 02 13:13:34 crc kubenswrapper[4724]: I1002 13:13:34.601628 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:13:34 crc kubenswrapper[4724]: I1002 13:13:34.729864 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Oct 02 13:13:36 crc kubenswrapper[4724]: I1002 13:13:36.649156 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:13:36 crc kubenswrapper[4724]: I1002 13:13:36.690877 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Oct 02 13:13:39 crc kubenswrapper[4724]: I1002 13:13:39.999070 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk"] Oct 02 13:13:40 crc kubenswrapper[4724]: E1002 13:13:39.999670 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="pull" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:39.999687 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="pull" Oct 02 13:13:40 crc kubenswrapper[4724]: E1002 13:13:39.999711 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="util" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:39.999719 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="util" Oct 02 
13:13:40 crc kubenswrapper[4724]: E1002 13:13:39.999737 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="extract" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:39.999745 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="extract" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:39.999948 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c83815e-5c93-45f4-9a44-48bb3e26f9c1" containerName="extract" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.000483 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.004092 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-f8r55" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.018110 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk"] Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.153221 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxn9\" (UniqueName: \"kubernetes.io/projected/7d397c20-cea7-4b10-8861-1b1da35bcc17-kube-api-access-mqxn9\") pod \"rabbitmq-cluster-operator-779fc9694b-nk8kk\" (UID: \"7d397c20-cea7-4b10-8861-1b1da35bcc17\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.255382 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqxn9\" (UniqueName: \"kubernetes.io/projected/7d397c20-cea7-4b10-8861-1b1da35bcc17-kube-api-access-mqxn9\") pod \"rabbitmq-cluster-operator-779fc9694b-nk8kk\" (UID: \"7d397c20-cea7-4b10-8861-1b1da35bcc17\") " 
pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.280262 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqxn9\" (UniqueName: \"kubernetes.io/projected/7d397c20-cea7-4b10-8861-1b1da35bcc17-kube-api-access-mqxn9\") pod \"rabbitmq-cluster-operator-779fc9694b-nk8kk\" (UID: \"7d397c20-cea7-4b10-8861-1b1da35bcc17\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.319566 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" Oct 02 13:13:40 crc kubenswrapper[4724]: I1002 13:13:40.788552 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk"] Oct 02 13:13:41 crc kubenswrapper[4724]: I1002 13:13:41.492216 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" event={"ID":"7d397c20-cea7-4b10-8861-1b1da35bcc17","Type":"ContainerStarted","Data":"313fff60dc51e04ad67bcbb18875d900076b41c7c64846aadb618f1de15b794a"} Oct 02 13:13:43 crc kubenswrapper[4724]: I1002 13:13:43.503890 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" event={"ID":"7d397c20-cea7-4b10-8861-1b1da35bcc17","Type":"ContainerStarted","Data":"a6e39436df185092fbb345d284f467162fd43a523e4df32e22b6d7799fc12c4c"} Oct 02 13:13:43 crc kubenswrapper[4724]: I1002 13:13:43.524339 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nk8kk" podStartSLOduration=2.523245732 podStartE2EDuration="4.524310728s" podCreationTimestamp="2025-10-02 13:13:39 +0000 UTC" firstStartedPulling="2025-10-02 13:13:40.805041718 +0000 UTC m=+885.259800839" 
lastFinishedPulling="2025-10-02 13:13:42.806106704 +0000 UTC m=+887.260865835" observedRunningTime="2025-10-02 13:13:43.518403402 +0000 UTC m=+887.973162533" watchObservedRunningTime="2025-10-02 13:13:43.524310728 +0000 UTC m=+887.979069889" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.645464 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.647414 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.653214 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-755b6" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.653529 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.653720 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.653883 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.654307 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.670000 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760284 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8t9c\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-kube-api-access-p8t9c\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") 
" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760357 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760382 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760403 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760460 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760514 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760560 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.760580 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862213 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862285 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862351 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\") pod \"rabbitmq-server-0\" (UID: 
\"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862404 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862428 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862446 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862490 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8t9c\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-kube-api-access-p8t9c\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862505 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " 
pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.862756 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.864145 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.864419 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.867353 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.868902 4724 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.868949 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/393643fb7b1912211c73251337277b797b656ba1239a640fbf45112e9fa2e19b/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.870264 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.880352 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.881442 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8t9c\" (UniqueName: \"kubernetes.io/projected/cfb3d3fc-ef69-4586-89af-7b9d221d61d7-kube-api-access-p8t9c\") pod \"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.921566 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93339f9f-ef74-4913-829f-f606f70dd1ce\") pod 
\"rabbitmq-server-0\" (UID: \"cfb3d3fc-ef69-4586-89af-7b9d221d61d7\") " pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:47 crc kubenswrapper[4724]: I1002 13:13:47.968689 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:13:48 crc kubenswrapper[4724]: I1002 13:13:48.207630 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Oct 02 13:13:48 crc kubenswrapper[4724]: I1002 13:13:48.542250 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"cfb3d3fc-ef69-4586-89af-7b9d221d61d7","Type":"ContainerStarted","Data":"6f204292cd5e4a5fd7aefa85cb499dd87a8bd49257c31b38f9190a58661af1f0"} Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.239442 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-6ctwc"] Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.241145 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.243487 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-g2ntp" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.252191 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6ctwc"] Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.383156 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bv8\" (UniqueName: \"kubernetes.io/projected/f7081f4f-6b12-4a89-b7bc-e24117cbf951-kube-api-access-t7bv8\") pod \"keystone-operator-index-6ctwc\" (UID: \"f7081f4f-6b12-4a89-b7bc-e24117cbf951\") " pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.484147 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bv8\" (UniqueName: \"kubernetes.io/projected/f7081f4f-6b12-4a89-b7bc-e24117cbf951-kube-api-access-t7bv8\") pod \"keystone-operator-index-6ctwc\" (UID: \"f7081f4f-6b12-4a89-b7bc-e24117cbf951\") " pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.508051 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7bv8\" (UniqueName: \"kubernetes.io/projected/f7081f4f-6b12-4a89-b7bc-e24117cbf951-kube-api-access-t7bv8\") pod \"keystone-operator-index-6ctwc\" (UID: \"f7081f4f-6b12-4a89-b7bc-e24117cbf951\") " pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.558440 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:49 crc kubenswrapper[4724]: I1002 13:13:49.964365 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6ctwc"] Oct 02 13:13:49 crc kubenswrapper[4724]: W1002 13:13:49.974897 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7081f4f_6b12_4a89_b7bc_e24117cbf951.slice/crio-db8a105dbbaadccb40ed518c4da29762de18ddce7fbedf8b2a5a603383ff5099 WatchSource:0}: Error finding container db8a105dbbaadccb40ed518c4da29762de18ddce7fbedf8b2a5a603383ff5099: Status 404 returned error can't find the container with id db8a105dbbaadccb40ed518c4da29762de18ddce7fbedf8b2a5a603383ff5099 Oct 02 13:13:50 crc kubenswrapper[4724]: I1002 13:13:50.562446 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6ctwc" event={"ID":"f7081f4f-6b12-4a89-b7bc-e24117cbf951","Type":"ContainerStarted","Data":"db8a105dbbaadccb40ed518c4da29762de18ddce7fbedf8b2a5a603383ff5099"} Oct 02 13:13:51 crc kubenswrapper[4724]: I1002 13:13:51.571222 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6ctwc" event={"ID":"f7081f4f-6b12-4a89-b7bc-e24117cbf951","Type":"ContainerStarted","Data":"56defe93f3f014529870fd7b33928c8fe16c4fb22079d8c624227233393f3d5e"} Oct 02 13:13:51 crc kubenswrapper[4724]: I1002 13:13:51.594843 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-6ctwc" podStartSLOduration=1.582207758 podStartE2EDuration="2.594822935s" podCreationTimestamp="2025-10-02 13:13:49 +0000 UTC" firstStartedPulling="2025-10-02 13:13:49.978095556 +0000 UTC m=+894.432854677" lastFinishedPulling="2025-10-02 13:13:50.990710733 +0000 UTC m=+895.445469854" observedRunningTime="2025-10-02 13:13:51.58744104 +0000 UTC m=+896.042200161" 
watchObservedRunningTime="2025-10-02 13:13:51.594822935 +0000 UTC m=+896.049582056" Oct 02 13:13:56 crc kubenswrapper[4724]: I1002 13:13:56.604042 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"cfb3d3fc-ef69-4586-89af-7b9d221d61d7","Type":"ContainerStarted","Data":"94b79f26bc2c765fe3e961d98832dbe8c82eb23e1ed94f918673ac41f48bfefe"} Oct 02 13:13:59 crc kubenswrapper[4724]: I1002 13:13:59.558738 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:59 crc kubenswrapper[4724]: I1002 13:13:59.559039 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:59 crc kubenswrapper[4724]: I1002 13:13:59.589010 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:13:59 crc kubenswrapper[4724]: I1002 13:13:59.682679 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-6ctwc" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.318678 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f"] Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.321002 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.324024 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.341854 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f"] Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.479804 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.480436 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzjdk\" (UniqueName: \"kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.480511 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 
13:14:07.582670 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.582814 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzjdk\" (UniqueName: \"kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.582926 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.583340 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.583507 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.620451 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzjdk\" (UniqueName: \"kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk\") pod \"d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:07 crc kubenswrapper[4724]: I1002 13:14:07.641611 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:08 crc kubenswrapper[4724]: I1002 13:14:08.060602 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f"] Oct 02 13:14:08 crc kubenswrapper[4724]: W1002 13:14:08.065788 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72227ad_ec88_4c28_b96e_989d90f420e8.slice/crio-c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af WatchSource:0}: Error finding container c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af: Status 404 returned error can't find the container with id c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af Oct 02 13:14:08 crc kubenswrapper[4724]: I1002 13:14:08.721529 4724 generic.go:334] "Generic (PLEG): container finished" podID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerID="87c8cefdef1b0d274d86b649d01b3731483981dd7c89b2843ccd5e331030a79e" exitCode=0 Oct 02 
13:14:08 crc kubenswrapper[4724]: I1002 13:14:08.721645 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" event={"ID":"c72227ad-ec88-4c28-b96e-989d90f420e8","Type":"ContainerDied","Data":"87c8cefdef1b0d274d86b649d01b3731483981dd7c89b2843ccd5e331030a79e"} Oct 02 13:14:08 crc kubenswrapper[4724]: I1002 13:14:08.721687 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" event={"ID":"c72227ad-ec88-4c28-b96e-989d90f420e8","Type":"ContainerStarted","Data":"c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af"} Oct 02 13:14:09 crc kubenswrapper[4724]: I1002 13:14:09.731580 4724 generic.go:334] "Generic (PLEG): container finished" podID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerID="4885377a8cdd6d8e45dd6412d9a2afe5145d70cbb51c7f26fa793c69a733b2aa" exitCode=0 Oct 02 13:14:09 crc kubenswrapper[4724]: I1002 13:14:09.731644 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" event={"ID":"c72227ad-ec88-4c28-b96e-989d90f420e8","Type":"ContainerDied","Data":"4885377a8cdd6d8e45dd6412d9a2afe5145d70cbb51c7f26fa793c69a733b2aa"} Oct 02 13:14:10 crc kubenswrapper[4724]: I1002 13:14:10.741693 4724 generic.go:334] "Generic (PLEG): container finished" podID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerID="05030220eacb7da414df1dbd5598708240dc357f3424b906e53ab5ce5b641744" exitCode=0 Oct 02 13:14:10 crc kubenswrapper[4724]: I1002 13:14:10.741919 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" event={"ID":"c72227ad-ec88-4c28-b96e-989d90f420e8","Type":"ContainerDied","Data":"05030220eacb7da414df1dbd5598708240dc357f3424b906e53ab5ce5b641744"} Oct 02 13:14:11 crc kubenswrapper[4724]: I1002 13:14:11.994748 
4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.160451 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle\") pod \"c72227ad-ec88-4c28-b96e-989d90f420e8\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.160631 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util\") pod \"c72227ad-ec88-4c28-b96e-989d90f420e8\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.160682 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzjdk\" (UniqueName: \"kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk\") pod \"c72227ad-ec88-4c28-b96e-989d90f420e8\" (UID: \"c72227ad-ec88-4c28-b96e-989d90f420e8\") " Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.161430 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle" (OuterVolumeSpecName: "bundle") pod "c72227ad-ec88-4c28-b96e-989d90f420e8" (UID: "c72227ad-ec88-4c28-b96e-989d90f420e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.170458 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk" (OuterVolumeSpecName: "kube-api-access-gzjdk") pod "c72227ad-ec88-4c28-b96e-989d90f420e8" (UID: "c72227ad-ec88-4c28-b96e-989d90f420e8"). 
InnerVolumeSpecName "kube-api-access-gzjdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.173785 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util" (OuterVolumeSpecName: "util") pod "c72227ad-ec88-4c28-b96e-989d90f420e8" (UID: "c72227ad-ec88-4c28-b96e-989d90f420e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.262256 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.262301 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzjdk\" (UniqueName: \"kubernetes.io/projected/c72227ad-ec88-4c28-b96e-989d90f420e8-kube-api-access-gzjdk\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.262314 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c72227ad-ec88-4c28-b96e-989d90f420e8-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.756070 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" event={"ID":"c72227ad-ec88-4c28-b96e-989d90f420e8","Type":"ContainerDied","Data":"c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af"} Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.756113 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f9d9f2fa9c152c8a3b1c699b32ddba6bd3aeacf92260dc458a9189d48e81af" Oct 02 13:14:12 crc kubenswrapper[4724]: I1002 13:14:12.756193 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.113209 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768"] Oct 02 13:14:23 crc kubenswrapper[4724]: E1002 13:14:23.114076 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="util" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.114089 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="util" Oct 02 13:14:23 crc kubenswrapper[4724]: E1002 13:14:23.114099 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="extract" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.114105 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="extract" Oct 02 13:14:23 crc kubenswrapper[4724]: E1002 13:14:23.114124 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="pull" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.114129 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="pull" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.114225 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72227ad-ec88-4c28-b96e-989d90f420e8" containerName="extract" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.114845 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.117185 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-z4pzz" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.118669 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.138297 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768"] Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.222511 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-webhook-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.222602 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4bq\" (UniqueName: \"kubernetes.io/projected/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-kube-api-access-xk4bq\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.223117 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-apiservice-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: 
\"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.324138 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-apiservice-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.324202 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-webhook-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.324227 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4bq\" (UniqueName: \"kubernetes.io/projected/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-kube-api-access-xk4bq\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.333251 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-webhook-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.334342 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-apiservice-cert\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.341142 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4bq\" (UniqueName: \"kubernetes.io/projected/b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d-kube-api-access-xk4bq\") pod \"keystone-operator-controller-manager-c48d5fbd5-pt768\" (UID: \"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d\") " pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.435692 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:23 crc kubenswrapper[4724]: I1002 13:14:23.867776 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768"] Oct 02 13:14:24 crc kubenswrapper[4724]: I1002 13:14:24.838214 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" event={"ID":"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d","Type":"ContainerStarted","Data":"d150b9f8488d27382957e517b4b719f26b8f2cced99d7caebcead8c13c9f65c9"} Oct 02 13:14:27 crc kubenswrapper[4724]: I1002 13:14:27.860073 4724 generic.go:334] "Generic (PLEG): container finished" podID="cfb3d3fc-ef69-4586-89af-7b9d221d61d7" containerID="94b79f26bc2c765fe3e961d98832dbe8c82eb23e1ed94f918673ac41f48bfefe" exitCode=0 Oct 02 13:14:27 crc kubenswrapper[4724]: I1002 13:14:27.860182 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" 
event={"ID":"cfb3d3fc-ef69-4586-89af-7b9d221d61d7","Type":"ContainerDied","Data":"94b79f26bc2c765fe3e961d98832dbe8c82eb23e1ed94f918673ac41f48bfefe"} Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.877709 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"cfb3d3fc-ef69-4586-89af-7b9d221d61d7","Type":"ContainerStarted","Data":"f9ff2673536858ac9a9a0e6002629271f1142113e563f8bcfba117ff7bb270cd"} Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.880009 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.887008 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" event={"ID":"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d","Type":"ContainerStarted","Data":"a82ffd1cad13e85a762e55b35cc3715e4a79d7f8f3115da5bccba505b7897610"} Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.887056 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" event={"ID":"b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d","Type":"ContainerStarted","Data":"3883c5b8b687a7fd37d7b210d6e8dc6b1d3e58bd5c5f9c5216025c934810afe8"} Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.887847 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:28 crc kubenswrapper[4724]: I1002 13:14:28.909567 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.114165916 podStartE2EDuration="42.909524893s" podCreationTimestamp="2025-10-02 13:13:46 +0000 UTC" firstStartedPulling="2025-10-02 13:13:48.235403542 +0000 UTC m=+892.690162673" lastFinishedPulling="2025-10-02 13:13:55.030762529 +0000 UTC 
m=+899.485521650" observedRunningTime="2025-10-02 13:14:28.907396738 +0000 UTC m=+933.362155859" watchObservedRunningTime="2025-10-02 13:14:28.909524893 +0000 UTC m=+933.364284014" Oct 02 13:14:33 crc kubenswrapper[4724]: I1002 13:14:33.442338 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" Oct 02 13:14:33 crc kubenswrapper[4724]: I1002 13:14:33.482660 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-c48d5fbd5-pt768" podStartSLOduration=6.370895793 podStartE2EDuration="10.482622484s" podCreationTimestamp="2025-10-02 13:14:23 +0000 UTC" firstStartedPulling="2025-10-02 13:14:23.879005586 +0000 UTC m=+928.333764707" lastFinishedPulling="2025-10-02 13:14:27.990732277 +0000 UTC m=+932.445491398" observedRunningTime="2025-10-02 13:14:28.935389375 +0000 UTC m=+933.390148496" watchObservedRunningTime="2025-10-02 13:14:33.482622484 +0000 UTC m=+937.937381645" Oct 02 13:14:34 crc kubenswrapper[4724]: I1002 13:14:34.734259 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:14:34 crc kubenswrapper[4724]: I1002 13:14:34.734973 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.445654 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 13:14:39 crc 
kubenswrapper[4724]: I1002 13:14:39.447637 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.450686 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-6n6d9" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.463506 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.573253 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcfdd\" (UniqueName: \"kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd\") pod \"horizon-operator-index-qskvr\" (UID: \"cf9c3c91-56de-4422-adef-f607d65be88b\") " pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.675062 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcfdd\" (UniqueName: \"kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd\") pod \"horizon-operator-index-qskvr\" (UID: \"cf9c3c91-56de-4422-adef-f607d65be88b\") " pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.696133 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcfdd\" (UniqueName: \"kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd\") pod \"horizon-operator-index-qskvr\" (UID: \"cf9c3c91-56de-4422-adef-f607d65be88b\") " pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:39 crc kubenswrapper[4724]: I1002 13:14:39.767162 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:40 crc kubenswrapper[4724]: I1002 13:14:40.240918 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 13:14:40 crc kubenswrapper[4724]: I1002 13:14:40.970826 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-qskvr" event={"ID":"cf9c3c91-56de-4422-adef-f607d65be88b","Type":"ContainerStarted","Data":"03393ba194822835ee170b1179e5562d4a7b6ee5dfdef99485bba589eca5b843"} Oct 02 13:14:42 crc kubenswrapper[4724]: I1002 13:14:42.851253 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-6wl8t"] Oct 02 13:14:42 crc kubenswrapper[4724]: I1002 13:14:42.853493 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:42 crc kubenswrapper[4724]: I1002 13:14:42.858156 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-pfmrc" Oct 02 13:14:42 crc kubenswrapper[4724]: I1002 13:14:42.868473 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-6wl8t"] Oct 02 13:14:42 crc kubenswrapper[4724]: I1002 13:14:42.928356 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgk8\" (UniqueName: \"kubernetes.io/projected/bea86036-f472-4a34-bc7a-dc5ccfa9dc77-kube-api-access-wzgk8\") pod \"swift-operator-index-6wl8t\" (UID: \"bea86036-f472-4a34-bc7a-dc5ccfa9dc77\") " pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.030071 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgk8\" (UniqueName: 
\"kubernetes.io/projected/bea86036-f472-4a34-bc7a-dc5ccfa9dc77-kube-api-access-wzgk8\") pod \"swift-operator-index-6wl8t\" (UID: \"bea86036-f472-4a34-bc7a-dc5ccfa9dc77\") " pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.064944 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgk8\" (UniqueName: \"kubernetes.io/projected/bea86036-f472-4a34-bc7a-dc5ccfa9dc77-kube-api-access-wzgk8\") pod \"swift-operator-index-6wl8t\" (UID: \"bea86036-f472-4a34-bc7a-dc5ccfa9dc77\") " pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.186147 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.409355 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-6wl8t"] Oct 02 13:14:43 crc kubenswrapper[4724]: W1002 13:14:43.413103 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea86036_f472_4a34_bc7a_dc5ccfa9dc77.slice/crio-784deb4c42ebea2ca207438afcbbacf852c66f04959be280ba874f6f1f53cad0 WatchSource:0}: Error finding container 784deb4c42ebea2ca207438afcbbacf852c66f04959be280ba874f6f1f53cad0: Status 404 returned error can't find the container with id 784deb4c42ebea2ca207438afcbbacf852c66f04959be280ba874f6f1f53cad0 Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.996586 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-qskvr" event={"ID":"cf9c3c91-56de-4422-adef-f607d65be88b","Type":"ContainerStarted","Data":"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582"} Oct 02 13:14:43 crc kubenswrapper[4724]: I1002 13:14:43.998880 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-index-6wl8t" event={"ID":"bea86036-f472-4a34-bc7a-dc5ccfa9dc77","Type":"ContainerStarted","Data":"784deb4c42ebea2ca207438afcbbacf852c66f04959be280ba874f6f1f53cad0"} Oct 02 13:14:44 crc kubenswrapper[4724]: I1002 13:14:44.026638 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-qskvr" podStartSLOduration=2.31916708 podStartE2EDuration="5.026595967s" podCreationTimestamp="2025-10-02 13:14:39 +0000 UTC" firstStartedPulling="2025-10-02 13:14:40.254428974 +0000 UTC m=+944.709188095" lastFinishedPulling="2025-10-02 13:14:42.961857861 +0000 UTC m=+947.416616982" observedRunningTime="2025-10-02 13:14:44.019415011 +0000 UTC m=+948.474174202" watchObservedRunningTime="2025-10-02 13:14:44.026595967 +0000 UTC m=+948.481355138" Oct 02 13:14:44 crc kubenswrapper[4724]: I1002 13:14:44.838250 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.023771 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-6wl8t" event={"ID":"bea86036-f472-4a34-bc7a-dc5ccfa9dc77","Type":"ContainerStarted","Data":"8a506ed44bf566fb22d360a8581ba4814f325f8c2cd5fe31fbef8e190040a6f9"} Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.050626 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-6wl8t" podStartSLOduration=2.233853909 podStartE2EDuration="3.050598776s" podCreationTimestamp="2025-10-02 13:14:42 +0000 UTC" firstStartedPulling="2025-10-02 13:14:43.418110388 +0000 UTC m=+947.872869519" lastFinishedPulling="2025-10-02 13:14:44.234855265 +0000 UTC m=+948.689614386" observedRunningTime="2025-10-02 13:14:45.040180545 +0000 UTC m=+949.494939666" watchObservedRunningTime="2025-10-02 13:14:45.050598776 +0000 UTC m=+949.505357897" Oct 02 13:14:45 crc 
kubenswrapper[4724]: I1002 13:14:45.646286 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-v5l8v"] Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.647127 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.656445 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-v5l8v"] Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.779946 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftlc\" (UniqueName: \"kubernetes.io/projected/7f4f2819-785c-4e86-9ed7-21e0b606a214-kube-api-access-fftlc\") pod \"horizon-operator-index-v5l8v\" (UID: \"7f4f2819-785c-4e86-9ed7-21e0b606a214\") " pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.881889 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftlc\" (UniqueName: \"kubernetes.io/projected/7f4f2819-785c-4e86-9ed7-21e0b606a214-kube-api-access-fftlc\") pod \"horizon-operator-index-v5l8v\" (UID: \"7f4f2819-785c-4e86-9ed7-21e0b606a214\") " pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.918728 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftlc\" (UniqueName: \"kubernetes.io/projected/7f4f2819-785c-4e86-9ed7-21e0b606a214-kube-api-access-fftlc\") pod \"horizon-operator-index-v5l8v\" (UID: \"7f4f2819-785c-4e86-9ed7-21e0b606a214\") " pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:45 crc kubenswrapper[4724]: I1002 13:14:45.965626 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.030799 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-qskvr" podUID="cf9c3c91-56de-4422-adef-f607d65be88b" containerName="registry-server" containerID="cri-o://1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582" gracePeriod=2 Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.488892 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.538613 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-v5l8v"] Oct 02 13:14:46 crc kubenswrapper[4724]: W1002 13:14:46.546618 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f4f2819_785c_4e86_9ed7_21e0b606a214.slice/crio-f031cb96efc48bc21cb0e5dca0e07d38a28ef402fa7f3076eb9cb5323182d93a WatchSource:0}: Error finding container f031cb96efc48bc21cb0e5dca0e07d38a28ef402fa7f3076eb9cb5323182d93a: Status 404 returned error can't find the container with id f031cb96efc48bc21cb0e5dca0e07d38a28ef402fa7f3076eb9cb5323182d93a Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.598368 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcfdd\" (UniqueName: \"kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd\") pod \"cf9c3c91-56de-4422-adef-f607d65be88b\" (UID: \"cf9c3c91-56de-4422-adef-f607d65be88b\") " Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.605870 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd" (OuterVolumeSpecName: 
"kube-api-access-dcfdd") pod "cf9c3c91-56de-4422-adef-f607d65be88b" (UID: "cf9c3c91-56de-4422-adef-f607d65be88b"). InnerVolumeSpecName "kube-api-access-dcfdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.700382 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcfdd\" (UniqueName: \"kubernetes.io/projected/cf9c3c91-56de-4422-adef-f607d65be88b-kube-api-access-dcfdd\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.950857 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-wz42b"] Oct 02 13:14:46 crc kubenswrapper[4724]: E1002 13:14:46.951297 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9c3c91-56de-4422-adef-f607d65be88b" containerName="registry-server" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.951342 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9c3c91-56de-4422-adef-f607d65be88b" containerName="registry-server" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.951627 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9c3c91-56de-4422-adef-f607d65be88b" containerName="registry-server" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.952333 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:46 crc kubenswrapper[4724]: I1002 13:14:46.960599 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-wz42b"] Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.004926 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xplnc\" (UniqueName: \"kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc\") pod \"keystone-db-create-wz42b\" (UID: \"36c4f42f-bce9-409f-9cec-2f315dc87943\") " pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.050716 4724 generic.go:334] "Generic (PLEG): container finished" podID="cf9c3c91-56de-4422-adef-f607d65be88b" containerID="1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582" exitCode=0 Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.050951 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-qskvr" event={"ID":"cf9c3c91-56de-4422-adef-f607d65be88b","Type":"ContainerDied","Data":"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582"} Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.050986 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-qskvr" event={"ID":"cf9c3c91-56de-4422-adef-f607d65be88b","Type":"ContainerDied","Data":"03393ba194822835ee170b1179e5562d4a7b6ee5dfdef99485bba589eca5b843"} Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.051002 4724 scope.go:117] "RemoveContainer" containerID="1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.051329 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-qskvr" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.053157 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-v5l8v" event={"ID":"7f4f2819-785c-4e86-9ed7-21e0b606a214","Type":"ContainerStarted","Data":"f031cb96efc48bc21cb0e5dca0e07d38a28ef402fa7f3076eb9cb5323182d93a"} Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.071259 4724 scope.go:117] "RemoveContainer" containerID="1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582" Oct 02 13:14:47 crc kubenswrapper[4724]: E1002 13:14:47.072551 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582\": container with ID starting with 1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582 not found: ID does not exist" containerID="1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.072602 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582"} err="failed to get container status \"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582\": rpc error: code = NotFound desc = could not find container \"1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582\": container with ID starting with 1d06501a8fdbab9b2691ac2d2c9b96eb6b11d1e497d6fc805004cc2acb875582 not found: ID does not exist" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.080632 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.085906 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-qskvr"] Oct 02 
13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.106040 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xplnc\" (UniqueName: \"kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc\") pod \"keystone-db-create-wz42b\" (UID: \"36c4f42f-bce9-409f-9cec-2f315dc87943\") " pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.127055 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xplnc\" (UniqueName: \"kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc\") pod \"keystone-db-create-wz42b\" (UID: \"36c4f42f-bce9-409f-9cec-2f315dc87943\") " pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.277341 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.737204 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-wz42b"] Oct 02 13:14:47 crc kubenswrapper[4724]: I1002 13:14:47.973767 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.062023 4724 generic.go:334] "Generic (PLEG): container finished" podID="36c4f42f-bce9-409f-9cec-2f315dc87943" containerID="7eab315f1cda69b1bb2594ac4cdc239054b59199569e93f9475c0f12dc21df3b" exitCode=0 Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.062098 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-wz42b" event={"ID":"36c4f42f-bce9-409f-9cec-2f315dc87943","Type":"ContainerDied","Data":"7eab315f1cda69b1bb2594ac4cdc239054b59199569e93f9475c0f12dc21df3b"} Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.062452 4724 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-wz42b" event={"ID":"36c4f42f-bce9-409f-9cec-2f315dc87943","Type":"ContainerStarted","Data":"ff7fa672efe0e0f003333aaff2d493ab6f2ebaffa3bc57ae3dffe05d9696bd29"} Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.065597 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-v5l8v" event={"ID":"7f4f2819-785c-4e86-9ed7-21e0b606a214","Type":"ContainerStarted","Data":"375453f3f98041be0d4a4fea726db4cbb1b0b8bbae083024e592b726a9ad97ea"} Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.103475 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-v5l8v" podStartSLOduration=2.56260891 podStartE2EDuration="3.103459493s" podCreationTimestamp="2025-10-02 13:14:45 +0000 UTC" firstStartedPulling="2025-10-02 13:14:46.550206273 +0000 UTC m=+951.004965394" lastFinishedPulling="2025-10-02 13:14:47.091056856 +0000 UTC m=+951.545815977" observedRunningTime="2025-10-02 13:14:48.101864962 +0000 UTC m=+952.556624113" watchObservedRunningTime="2025-10-02 13:14:48.103459493 +0000 UTC m=+952.558218614" Oct 02 13:14:48 crc kubenswrapper[4724]: I1002 13:14:48.323039 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9c3c91-56de-4422-adef-f607d65be88b" path="/var/lib/kubelet/pods/cf9c3c91-56de-4422-adef-f607d65be88b/volumes" Oct 02 13:14:49 crc kubenswrapper[4724]: I1002 13:14:49.375435 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:49 crc kubenswrapper[4724]: I1002 13:14:49.443724 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xplnc\" (UniqueName: \"kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc\") pod \"36c4f42f-bce9-409f-9cec-2f315dc87943\" (UID: \"36c4f42f-bce9-409f-9cec-2f315dc87943\") " Oct 02 13:14:49 crc kubenswrapper[4724]: I1002 13:14:49.450894 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc" (OuterVolumeSpecName: "kube-api-access-xplnc") pod "36c4f42f-bce9-409f-9cec-2f315dc87943" (UID: "36c4f42f-bce9-409f-9cec-2f315dc87943"). InnerVolumeSpecName "kube-api-access-xplnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:14:49 crc kubenswrapper[4724]: I1002 13:14:49.546312 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xplnc\" (UniqueName: \"kubernetes.io/projected/36c4f42f-bce9-409f-9cec-2f315dc87943-kube-api-access-xplnc\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:50 crc kubenswrapper[4724]: I1002 13:14:50.095291 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-wz42b" event={"ID":"36c4f42f-bce9-409f-9cec-2f315dc87943","Type":"ContainerDied","Data":"ff7fa672efe0e0f003333aaff2d493ab6f2ebaffa3bc57ae3dffe05d9696bd29"} Oct 02 13:14:50 crc kubenswrapper[4724]: I1002 13:14:50.095336 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7fa672efe0e0f003333aaff2d493ab6f2ebaffa3bc57ae3dffe05d9696bd29" Oct 02 13:14:50 crc kubenswrapper[4724]: I1002 13:14:50.095375 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-wz42b" Oct 02 13:14:53 crc kubenswrapper[4724]: I1002 13:14:53.186690 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:53 crc kubenswrapper[4724]: I1002 13:14:53.187188 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:53 crc kubenswrapper[4724]: I1002 13:14:53.222609 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:54 crc kubenswrapper[4724]: I1002 13:14:54.152775 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-6wl8t" Oct 02 13:14:55 crc kubenswrapper[4724]: I1002 13:14:55.966117 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:55 crc kubenswrapper[4724]: I1002 13:14:55.966181 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:55 crc kubenswrapper[4724]: I1002 13:14:55.994422 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.165160 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-v5l8v" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.848266 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-4534-account-create-zpl5t"] Oct 02 13:14:56 crc kubenswrapper[4724]: E1002 13:14:56.849038 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4f42f-bce9-409f-9cec-2f315dc87943" containerName="mariadb-database-create" Oct 02 
13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.849066 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4f42f-bce9-409f-9cec-2f315dc87943" containerName="mariadb-database-create" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.849211 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c4f42f-bce9-409f-9cec-2f315dc87943" containerName="mariadb-database-create" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.849974 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.852722 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.857659 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sr7h\" (UniqueName: \"kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h\") pod \"keystone-4534-account-create-zpl5t\" (UID: \"7c2ee54e-6822-42cb-940b-0734caf0ba50\") " pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.859722 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-4534-account-create-zpl5t"] Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.959824 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sr7h\" (UniqueName: \"kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h\") pod \"keystone-4534-account-create-zpl5t\" (UID: \"7c2ee54e-6822-42cb-940b-0734caf0ba50\") " pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:56 crc kubenswrapper[4724]: I1002 13:14:56.980557 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sr7h\" (UniqueName: 
\"kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h\") pod \"keystone-4534-account-create-zpl5t\" (UID: \"7c2ee54e-6822-42cb-940b-0734caf0ba50\") " pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:57 crc kubenswrapper[4724]: I1002 13:14:57.170202 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:57 crc kubenswrapper[4724]: I1002 13:14:57.656327 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-4534-account-create-zpl5t"] Oct 02 13:14:57 crc kubenswrapper[4724]: W1002 13:14:57.673936 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c2ee54e_6822_42cb_940b_0734caf0ba50.slice/crio-9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd WatchSource:0}: Error finding container 9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd: Status 404 returned error can't find the container with id 9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd Oct 02 13:14:58 crc kubenswrapper[4724]: I1002 13:14:58.174293 4724 generic.go:334] "Generic (PLEG): container finished" podID="7c2ee54e-6822-42cb-940b-0734caf0ba50" containerID="7a27c7c57714900861ce163e3d00ce5b99fff0180b7d17ffd6eda52f9c62eba5" exitCode=0 Oct 02 13:14:58 crc kubenswrapper[4724]: I1002 13:14:58.174388 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" event={"ID":"7c2ee54e-6822-42cb-940b-0734caf0ba50","Type":"ContainerDied","Data":"7a27c7c57714900861ce163e3d00ce5b99fff0180b7d17ffd6eda52f9c62eba5"} Oct 02 13:14:58 crc kubenswrapper[4724]: I1002 13:14:58.174770 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" 
event={"ID":"7c2ee54e-6822-42cb-940b-0734caf0ba50","Type":"ContainerStarted","Data":"9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd"} Oct 02 13:14:59 crc kubenswrapper[4724]: I1002 13:14:59.508883 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:14:59 crc kubenswrapper[4724]: I1002 13:14:59.598614 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sr7h\" (UniqueName: \"kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h\") pod \"7c2ee54e-6822-42cb-940b-0734caf0ba50\" (UID: \"7c2ee54e-6822-42cb-940b-0734caf0ba50\") " Oct 02 13:14:59 crc kubenswrapper[4724]: I1002 13:14:59.606452 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h" (OuterVolumeSpecName: "kube-api-access-5sr7h") pod "7c2ee54e-6822-42cb-940b-0734caf0ba50" (UID: "7c2ee54e-6822-42cb-940b-0734caf0ba50"). InnerVolumeSpecName "kube-api-access-5sr7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:14:59 crc kubenswrapper[4724]: I1002 13:14:59.699858 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sr7h\" (UniqueName: \"kubernetes.io/projected/7c2ee54e-6822-42cb-940b-0734caf0ba50-kube-api-access-5sr7h\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.130341 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw"] Oct 02 13:15:00 crc kubenswrapper[4724]: E1002 13:15:00.130659 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2ee54e-6822-42cb-940b-0734caf0ba50" containerName="mariadb-account-create" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.130674 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2ee54e-6822-42cb-940b-0734caf0ba50" containerName="mariadb-account-create" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.130834 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c2ee54e-6822-42cb-940b-0734caf0ba50" containerName="mariadb-account-create" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.131313 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.133622 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.134340 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.148287 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw"] Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.193618 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" event={"ID":"7c2ee54e-6822-42cb-940b-0734caf0ba50","Type":"ContainerDied","Data":"9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd"} Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.193663 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9391dc27b6b4ec1b4401f1be973fb3b612ae5463832d726f2149a0c157bea1cd" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.193727 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-4534-account-create-zpl5t" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.206334 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.206480 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.206601 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdd7k\" (UniqueName: \"kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.313206 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdd7k\" (UniqueName: \"kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.313320 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.313487 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.314792 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.326786 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.334354 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdd7k\" (UniqueName: \"kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k\") pod \"collect-profiles-29323515-7vcgw\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 
13:15:00.464097 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:00 crc kubenswrapper[4724]: I1002 13:15:00.882271 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw"] Oct 02 13:15:01 crc kubenswrapper[4724]: I1002 13:15:01.203749 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" event={"ID":"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26","Type":"ContainerStarted","Data":"fbd7304c946478d22572b3c609526fc54e9264f0f0bb69328026366c94b0ec4c"} Oct 02 13:15:01 crc kubenswrapper[4724]: I1002 13:15:01.204255 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" event={"ID":"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26","Type":"ContainerStarted","Data":"2c3bf5e0a1aebf1c8e6c280aa8dfbea13adac625229b03936fbca7d99f2076fa"} Oct 02 13:15:01 crc kubenswrapper[4724]: I1002 13:15:01.224039 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" podStartSLOduration=1.224023469 podStartE2EDuration="1.224023469s" podCreationTimestamp="2025-10-02 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:15:01.223989248 +0000 UTC m=+965.678748369" watchObservedRunningTime="2025-10-02 13:15:01.224023469 +0000 UTC m=+965.678782590" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.210827 4724 generic.go:334] "Generic (PLEG): container finished" podID="cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" containerID="fbd7304c946478d22572b3c609526fc54e9264f0f0bb69328026366c94b0ec4c" exitCode=0 Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.210882 4724 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" event={"ID":"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26","Type":"ContainerDied","Data":"fbd7304c946478d22572b3c609526fc54e9264f0f0bb69328026366c94b0ec4c"} Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.308859 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-g8m6n"] Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.310646 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.313914 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.314084 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.314352 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-sgttv" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.315712 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.321210 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-g8m6n"] Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.343455 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.344780 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zmbml\" (UniqueName: \"kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.445957 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmbml\" (UniqueName: \"kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.446044 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.459743 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.478494 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbml\" (UniqueName: \"kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml\") pod \"keystone-db-sync-g8m6n\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:02 crc kubenswrapper[4724]: I1002 13:15:02.626970 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.161741 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-g8m6n"] Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.218418 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" event={"ID":"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc","Type":"ContainerStarted","Data":"872ccd8954c8719b13680745555e0bc96f242a0d0cdc1d9355fa55ed96c7db0a"} Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.449666 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.568305 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume\") pod \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.568663 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume\") pod \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.568708 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdd7k\" (UniqueName: \"kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k\") pod \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\" (UID: \"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26\") " Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.569976 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" (UID: "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.575782 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k" (OuterVolumeSpecName: "kube-api-access-vdd7k") pod "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" (UID: "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26"). InnerVolumeSpecName "kube-api-access-vdd7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.578110 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" (UID: "cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.670889 4724 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.670938 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdd7k\" (UniqueName: \"kubernetes.io/projected/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-kube-api-access-vdd7k\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:03 crc kubenswrapper[4724]: I1002 13:15:03.670950 4724 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:04 crc kubenswrapper[4724]: I1002 13:15:04.230781 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" event={"ID":"cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26","Type":"ContainerDied","Data":"2c3bf5e0a1aebf1c8e6c280aa8dfbea13adac625229b03936fbca7d99f2076fa"} Oct 02 13:15:04 crc kubenswrapper[4724]: I1002 13:15:04.231307 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3bf5e0a1aebf1c8e6c280aa8dfbea13adac625229b03936fbca7d99f2076fa" Oct 02 13:15:04 crc kubenswrapper[4724]: I1002 13:15:04.230799 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-7vcgw" Oct 02 13:15:04 crc kubenswrapper[4724]: I1002 13:15:04.735309 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:15:04 crc kubenswrapper[4724]: I1002 13:15:04.735402 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:15:10 crc kubenswrapper[4724]: I1002 13:15:10.276179 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" event={"ID":"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc","Type":"ContainerStarted","Data":"82ebaf4fcfc4c9fdd8e051efb8e53b45a5c30f50b3a57245e4f64a180a5f7227"} Oct 02 13:15:10 crc kubenswrapper[4724]: I1002 13:15:10.296460 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" podStartSLOduration=1.754802464 podStartE2EDuration="8.296440382s" podCreationTimestamp="2025-10-02 13:15:02 +0000 UTC" firstStartedPulling="2025-10-02 13:15:03.174273318 +0000 UTC m=+967.629032439" lastFinishedPulling="2025-10-02 13:15:09.715911236 +0000 UTC m=+974.170670357" observedRunningTime="2025-10-02 13:15:10.290913159 +0000 UTC m=+974.745672280" watchObservedRunningTime="2025-10-02 13:15:10.296440382 +0000 UTC m=+974.751199503" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.507330 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g"] Oct 
02 13:15:13 crc kubenswrapper[4724]: E1002 13:15:13.508279 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.508310 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.508449 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa0cbe9-5605-4e80-b8a9-b5f1ab7c1d26" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.509568 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.512634 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.523917 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g"] Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.622659 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfr4c\" (UniqueName: \"kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.622759 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle\") pod 
\"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.622895 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.724207 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.724310 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.724356 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfr4c\" (UniqueName: \"kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " 
pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.724926 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.724949 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.752017 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfr4c\" (UniqueName: \"kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c\") pod \"24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") " pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:13 crc kubenswrapper[4724]: I1002 13:15:13.882228 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.288650 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq"] Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.290200 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.310322 4724 generic.go:334] "Generic (PLEG): container finished" podID="6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" containerID="82ebaf4fcfc4c9fdd8e051efb8e53b45a5c30f50b3a57245e4f64a180a5f7227" exitCode=0 Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.310875 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" event={"ID":"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc","Type":"ContainerDied","Data":"82ebaf4fcfc4c9fdd8e051efb8e53b45a5c30f50b3a57245e4f64a180a5f7227"} Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.312867 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g"] Oct 02 13:15:14 crc kubenswrapper[4724]: W1002 13:15:14.320443 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de57d48_755b_4e1c_a6f0_88e5cb02d827.slice/crio-d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c WatchSource:0}: Error finding container d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c: Status 404 returned error can't find the container with id d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.337229 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq"] Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.434511 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6km2j\" (UniqueName: \"kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j\") pod 
\"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.434585 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.434606 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.536496 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6km2j\" (UniqueName: \"kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.536582 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " 
pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.536613 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.537278 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.537420 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.560524 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6km2j\" (UniqueName: \"kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j\") pod \"c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") " pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:14 crc kubenswrapper[4724]: I1002 13:15:14.613505 4724 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.072244 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq"] Oct 02 13:15:15 crc kubenswrapper[4724]: W1002 13:15:15.080135 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d91c41_08cd_4174_b468_54e2142c767e.slice/crio-f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5 WatchSource:0}: Error finding container f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5: Status 404 returned error can't find the container with id f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5 Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.328239 4724 generic.go:334] "Generic (PLEG): container finished" podID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerID="b3aba7d34510b3b9f66e1656076a3825d64e4c26453111fbc08a42992311c9ad" exitCode=0 Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.328344 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" event={"ID":"0de57d48-755b-4e1c-a6f0-88e5cb02d827","Type":"ContainerDied","Data":"b3aba7d34510b3b9f66e1656076a3825d64e4c26453111fbc08a42992311c9ad"} Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.328382 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" event={"ID":"0de57d48-755b-4e1c-a6f0-88e5cb02d827","Type":"ContainerStarted","Data":"d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c"} Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.330577 4724 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.332837 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerStarted","Data":"f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5"} Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.612967 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.756592 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data\") pod \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.756766 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmbml\" (UniqueName: \"kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml\") pod \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\" (UID: \"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc\") " Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.767122 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml" (OuterVolumeSpecName: "kube-api-access-zmbml") pod "6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" (UID: "6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc"). InnerVolumeSpecName "kube-api-access-zmbml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.796758 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data" (OuterVolumeSpecName: "config-data") pod "6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" (UID: "6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.859058 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:15 crc kubenswrapper[4724]: I1002 13:15:15.859137 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmbml\" (UniqueName: \"kubernetes.io/projected/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc-kube-api-access-zmbml\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.353322 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerStarted","Data":"5e707d986d5795d0e750b5c67ef96e797474a383c5da530f9579fc26d6c514de"} Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.364071 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" event={"ID":"6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc","Type":"ContainerDied","Data":"872ccd8954c8719b13680745555e0bc96f242a0d0cdc1d9355fa55ed96c7db0a"} Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.364129 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872ccd8954c8719b13680745555e0bc96f242a0d0cdc1d9355fa55ed96c7db0a" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.364345 4724 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-g8m6n" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.558260 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-b4gt5"] Oct 02 13:15:16 crc kubenswrapper[4724]: E1002 13:15:16.567875 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" containerName="keystone-db-sync" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.567931 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" containerName="keystone-db-sync" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.568136 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" containerName="keystone-db-sync" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.568780 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-b4gt5"] Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.568894 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.574019 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.574203 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.574411 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-sgttv" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.581272 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.673262 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.673883 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr97h\" (UniqueName: \"kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.673909 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 
13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.673933 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.673989 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.775247 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr97h\" (UniqueName: \"kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.775302 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.775326 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc 
kubenswrapper[4724]: I1002 13:15:16.775356 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.775464 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.779779 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.780138 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.780672 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.784050 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.793723 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr97h\" (UniqueName: \"kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h\") pod \"keystone-bootstrap-b4gt5\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") " pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:16 crc kubenswrapper[4724]: I1002 13:15:16.893004 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" Oct 02 13:15:17 crc kubenswrapper[4724]: I1002 13:15:17.361491 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-b4gt5"] Oct 02 13:15:17 crc kubenswrapper[4724]: I1002 13:15:17.375281 4724 generic.go:334] "Generic (PLEG): container finished" podID="18d91c41-08cd-4174-b468-54e2142c767e" containerID="5e707d986d5795d0e750b5c67ef96e797474a383c5da530f9579fc26d6c514de" exitCode=0 Oct 02 13:15:17 crc kubenswrapper[4724]: I1002 13:15:17.375342 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerDied","Data":"5e707d986d5795d0e750b5c67ef96e797474a383c5da530f9579fc26d6c514de"} Oct 02 13:15:18 crc kubenswrapper[4724]: I1002 13:15:18.385352 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" event={"ID":"740c0d97-42e5-421d-9def-352ddefb326c","Type":"ContainerStarted","Data":"d99e7398bd032c3cb19d91987704bc072ca77672dc6529cd6ca63f0c3ff22cdc"} Oct 02 13:15:18 crc kubenswrapper[4724]: I1002 13:15:18.385405 4724 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" event={"ID":"740c0d97-42e5-421d-9def-352ddefb326c","Type":"ContainerStarted","Data":"8f2cbd270361251b620ca32ec7407a14a2b75843941e0e4f90571f5be5bae4a4"} Oct 02 13:15:18 crc kubenswrapper[4724]: I1002 13:15:18.410173 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" podStartSLOduration=2.410147303 podStartE2EDuration="2.410147303s" podCreationTimestamp="2025-10-02 13:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:15:18.402026422 +0000 UTC m=+982.856785583" watchObservedRunningTime="2025-10-02 13:15:18.410147303 +0000 UTC m=+982.864906424" Oct 02 13:15:23 crc kubenswrapper[4724]: I1002 13:15:23.460681 4724 generic.go:334] "Generic (PLEG): container finished" podID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerID="062c440ba54249b9267d6af5b36d575e9f9007daad58f00cb0e814c70434d7ff" exitCode=0 Oct 02 13:15:23 crc kubenswrapper[4724]: I1002 13:15:23.460854 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" event={"ID":"0de57d48-755b-4e1c-a6f0-88e5cb02d827","Type":"ContainerDied","Data":"062c440ba54249b9267d6af5b36d575e9f9007daad58f00cb0e814c70434d7ff"} Oct 02 13:15:23 crc kubenswrapper[4724]: I1002 13:15:23.468049 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerStarted","Data":"ce4a2d0ac2cca8e9b8f203a6475d7dd5b1ab1f4623bd824c604e7be43acf178f"} Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.478146 4724 generic.go:334] "Generic (PLEG): container finished" podID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" 
containerID="322367a297c09be235b93515e5eb23b17c1de842e858c635515e8b8ddddeeb09" exitCode=0 Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.478312 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" event={"ID":"0de57d48-755b-4e1c-a6f0-88e5cb02d827","Type":"ContainerDied","Data":"322367a297c09be235b93515e5eb23b17c1de842e858c635515e8b8ddddeeb09"} Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.481625 4724 generic.go:334] "Generic (PLEG): container finished" podID="740c0d97-42e5-421d-9def-352ddefb326c" containerID="d99e7398bd032c3cb19d91987704bc072ca77672dc6529cd6ca63f0c3ff22cdc" exitCode=0 Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.481997 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" event={"ID":"740c0d97-42e5-421d-9def-352ddefb326c","Type":"ContainerDied","Data":"d99e7398bd032c3cb19d91987704bc072ca77672dc6529cd6ca63f0c3ff22cdc"} Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.484629 4724 generic.go:334] "Generic (PLEG): container finished" podID="18d91c41-08cd-4174-b468-54e2142c767e" containerID="ce4a2d0ac2cca8e9b8f203a6475d7dd5b1ab1f4623bd824c604e7be43acf178f" exitCode=0 Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.484838 4724 generic.go:334] "Generic (PLEG): container finished" podID="18d91c41-08cd-4174-b468-54e2142c767e" containerID="c9ffc9c5b3df7dcbf74be659a91f022a81cab066188febcf2ba45855fcbac0a7" exitCode=0 Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.484706 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerDied","Data":"ce4a2d0ac2cca8e9b8f203a6475d7dd5b1ab1f4623bd824c604e7be43acf178f"} Oct 02 13:15:24 crc kubenswrapper[4724]: I1002 13:15:24.485053 4724 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerDied","Data":"c9ffc9c5b3df7dcbf74be659a91f022a81cab066188febcf2ba45855fcbac0a7"}
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.856967 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq"
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.935350 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5"
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.935483 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g"
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939362 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle\") pod \"18d91c41-08cd-4174-b468-54e2142c767e\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939436 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6km2j\" (UniqueName: \"kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j\") pod \"18d91c41-08cd-4174-b468-54e2142c767e\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939453 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys\") pod \"740c0d97-42e5-421d-9def-352ddefb326c\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939473 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data\") pod \"740c0d97-42e5-421d-9def-352ddefb326c\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939521 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys\") pod \"740c0d97-42e5-421d-9def-352ddefb326c\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939575 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle\") pod \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939625 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr97h\" (UniqueName: \"kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h\") pod \"740c0d97-42e5-421d-9def-352ddefb326c\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939643 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util\") pod \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939660 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfr4c\" (UniqueName: \"kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c\") pod \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\" (UID: \"0de57d48-755b-4e1c-a6f0-88e5cb02d827\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939706 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util\") pod \"18d91c41-08cd-4174-b468-54e2142c767e\" (UID: \"18d91c41-08cd-4174-b468-54e2142c767e\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.939727 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts\") pod \"740c0d97-42e5-421d-9def-352ddefb326c\" (UID: \"740c0d97-42e5-421d-9def-352ddefb326c\") "
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.940592 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle" (OuterVolumeSpecName: "bundle") pod "0de57d48-755b-4e1c-a6f0-88e5cb02d827" (UID: "0de57d48-755b-4e1c-a6f0-88e5cb02d827"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.941745 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle" (OuterVolumeSpecName: "bundle") pod "18d91c41-08cd-4174-b468-54e2142c767e" (UID: "18d91c41-08cd-4174-b468-54e2142c767e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.946003 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c" (OuterVolumeSpecName: "kube-api-access-tfr4c") pod "0de57d48-755b-4e1c-a6f0-88e5cb02d827" (UID: "0de57d48-755b-4e1c-a6f0-88e5cb02d827"). InnerVolumeSpecName "kube-api-access-tfr4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.948990 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts" (OuterVolumeSpecName: "scripts") pod "740c0d97-42e5-421d-9def-352ddefb326c" (UID: "740c0d97-42e5-421d-9def-352ddefb326c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.949036 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j" (OuterVolumeSpecName: "kube-api-access-6km2j") pod "18d91c41-08cd-4174-b468-54e2142c767e" (UID: "18d91c41-08cd-4174-b468-54e2142c767e"). InnerVolumeSpecName "kube-api-access-6km2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.951665 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "740c0d97-42e5-421d-9def-352ddefb326c" (UID: "740c0d97-42e5-421d-9def-352ddefb326c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.952333 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util" (OuterVolumeSpecName: "util") pod "0de57d48-755b-4e1c-a6f0-88e5cb02d827" (UID: "0de57d48-755b-4e1c-a6f0-88e5cb02d827"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.953723 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "740c0d97-42e5-421d-9def-352ddefb326c" (UID: "740c0d97-42e5-421d-9def-352ddefb326c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.959314 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h" (OuterVolumeSpecName: "kube-api-access-jr97h") pod "740c0d97-42e5-421d-9def-352ddefb326c" (UID: "740c0d97-42e5-421d-9def-352ddefb326c"). InnerVolumeSpecName "kube-api-access-jr97h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.972912 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data" (OuterVolumeSpecName: "config-data") pod "740c0d97-42e5-421d-9def-352ddefb326c" (UID: "740c0d97-42e5-421d-9def-352ddefb326c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:15:25 crc kubenswrapper[4724]: I1002 13:15:25.979650 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util" (OuterVolumeSpecName: "util") pod "18d91c41-08cd-4174-b468-54e2142c767e" (UID: "18d91c41-08cd-4174-b468-54e2142c767e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.040972 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6km2j\" (UniqueName: \"kubernetes.io/projected/18d91c41-08cd-4174-b468-54e2142c767e-kube-api-access-6km2j\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041019 4724 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041035 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041047 4724 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041059 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041068 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr97h\" (UniqueName: \"kubernetes.io/projected/740c0d97-42e5-421d-9def-352ddefb326c-kube-api-access-jr97h\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041077 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0de57d48-755b-4e1c-a6f0-88e5cb02d827-util\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041085 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfr4c\" (UniqueName: \"kubernetes.io/projected/0de57d48-755b-4e1c-a6f0-88e5cb02d827-kube-api-access-tfr4c\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041093 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-util\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041101 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740c0d97-42e5-421d-9def-352ddefb326c-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.041110 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d91c41-08cd-4174-b468-54e2142c767e-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.503923 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g" event={"ID":"0de57d48-755b-4e1c-a6f0-88e5cb02d827","Type":"ContainerDied","Data":"d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c"}
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.503966 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d770a8c4362b870c5d6b2c64bfae0d9dd32ded766dcbf6cb83bc6454868da12c"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.503981 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.507446 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5" event={"ID":"740c0d97-42e5-421d-9def-352ddefb326c","Type":"ContainerDied","Data":"8f2cbd270361251b620ca32ec7407a14a2b75843941e0e4f90571f5be5bae4a4"}
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.507479 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f2cbd270361251b620ca32ec7407a14a2b75843941e0e4f90571f5be5bae4a4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.507529 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-b4gt5"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.511852 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq" event={"ID":"18d91c41-08cd-4174-b468-54e2142c767e","Type":"ContainerDied","Data":"f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5"}
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.511882 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f978efec1788c6b09aecdef8fb7906ca657240e93ecb15cca8377fc54f8655d5"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.511939 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.619784 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-86d476bccd-dbrt4"]
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.620675 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.620824 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.620937 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="pull"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.621036 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="pull"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.621126 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="util"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.621216 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="util"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.621292 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740c0d97-42e5-421d-9def-352ddefb326c" containerName="keystone-bootstrap"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.621377 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="740c0d97-42e5-421d-9def-352ddefb326c" containerName="keystone-bootstrap"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.621480 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="util"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.621589 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="util"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.621683 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="pull"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.621811 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="pull"
Oct 02 13:15:26 crc kubenswrapper[4724]: E1002 13:15:26.621926 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.622042 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.622345 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="740c0d97-42e5-421d-9def-352ddefb326c" containerName="keystone-bootstrap"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.622703 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de57d48-755b-4e1c-a6f0-88e5cb02d827" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.622805 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d91c41-08cd-4174-b468-54e2142c767e" containerName="extract"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.623802 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.627281 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-sgttv"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.627281 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.627662 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.630755 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.632393 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-86d476bccd-dbrt4"]
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.750686 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-scripts\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.750848 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzph\" (UniqueName: \"kubernetes.io/projected/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-kube-api-access-5bzph\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.750902 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-credential-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.751050 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-config-data\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.751198 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-fernet-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.852206 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-credential-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.852287 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzph\" (UniqueName: \"kubernetes.io/projected/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-kube-api-access-5bzph\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.852378 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-config-data\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.852477 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-fernet-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.852556 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-scripts\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.856673 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-scripts\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.857704 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-fernet-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.857862 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-credential-keys\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.859181 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-config-data\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.871328 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzph\" (UniqueName: \"kubernetes.io/projected/a7202e55-07f4-4190-8fc9-7ff3d6c5581f-kube-api-access-5bzph\") pod \"keystone-86d476bccd-dbrt4\" (UID: \"a7202e55-07f4-4190-8fc9-7ff3d6c5581f\") " pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:26 crc kubenswrapper[4724]: I1002 13:15:26.947704 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:27 crc kubenswrapper[4724]: I1002 13:15:27.391810 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-86d476bccd-dbrt4"]
Oct 02 13:15:27 crc kubenswrapper[4724]: W1002 13:15:27.395367 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7202e55_07f4_4190_8fc9_7ff3d6c5581f.slice/crio-0db922b0065af4f675c49f477af1d04327122707a9e769c18f4dd3f782d011a1 WatchSource:0}: Error finding container 0db922b0065af4f675c49f477af1d04327122707a9e769c18f4dd3f782d011a1: Status 404 returned error can't find the container with id 0db922b0065af4f675c49f477af1d04327122707a9e769c18f4dd3f782d011a1
Oct 02 13:15:27 crc kubenswrapper[4724]: I1002 13:15:27.523927 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4" event={"ID":"a7202e55-07f4-4190-8fc9-7ff3d6c5581f","Type":"ContainerStarted","Data":"0db922b0065af4f675c49f477af1d04327122707a9e769c18f4dd3f782d011a1"}
Oct 02 13:15:28 crc kubenswrapper[4724]: I1002 13:15:28.534147 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4" event={"ID":"a7202e55-07f4-4190-8fc9-7ff3d6c5581f","Type":"ContainerStarted","Data":"90425df917f411243370db238ce7732be4da3413830eb8b23629b5423673d2f5"}
Oct 02 13:15:28 crc kubenswrapper[4724]: I1002 13:15:28.534637 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4"
Oct 02 13:15:28 crc kubenswrapper[4724]: I1002 13:15:28.554987 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4" podStartSLOduration=2.5547931520000002 podStartE2EDuration="2.554793152s" podCreationTimestamp="2025-10-02 13:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:15:28.553111669 +0000 UTC m=+993.007870790" watchObservedRunningTime="2025-10-02 13:15:28.554793152 +0000 UTC m=+993.009552293"
Oct 02 13:15:34 crc kubenswrapper[4724]: I1002 13:15:34.733889 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:15:34 crc kubenswrapper[4724]: I1002 13:15:34.734428 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:15:34 crc kubenswrapper[4724]: I1002 13:15:34.734474 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t"
Oct 02 13:15:34 crc kubenswrapper[4724]: I1002 13:15:34.735095 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 13:15:34 crc kubenswrapper[4724]: I1002 13:15:34.735147 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795" gracePeriod=600
Oct 02 13:15:35 crc kubenswrapper[4724]: I1002 13:15:35.586983 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795" exitCode=0
Oct 02 13:15:35 crc kubenswrapper[4724]: I1002 13:15:35.587034 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795"}
Oct 02 13:15:35 crc kubenswrapper[4724]: I1002 13:15:35.587361 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8"}
Oct 02 13:15:35 crc kubenswrapper[4724]: I1002 13:15:35.587390 4724 scope.go:117] "RemoveContainer" containerID="dec5500d23c7c01c852c0e2b0478c7abcfcd5e96c2408d1a5b49547642fc64d1"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.379857 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"]
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.381781 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.383870 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.384202 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rq7jh"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.449188 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"]
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.495392 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5j5k\" (UniqueName: \"kubernetes.io/projected/9e2ec7b8-85ef-400d-ac94-39a733e729aa-kube-api-access-p5j5k\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.495485 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-webhook-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.495526 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-apiservice-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.597345 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-webhook-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.597415 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-apiservice-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.597491 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5j5k\" (UniqueName: \"kubernetes.io/projected/9e2ec7b8-85ef-400d-ac94-39a733e729aa-kube-api-access-p5j5k\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.606680 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-apiservice-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.609298 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e2ec7b8-85ef-400d-ac94-39a733e729aa-webhook-cert\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.623496 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5j5k\" (UniqueName: \"kubernetes.io/projected/9e2ec7b8-85ef-400d-ac94-39a733e729aa-kube-api-access-p5j5k\") pod \"swift-operator-controller-manager-5947468b68-64ngp\" (UID: \"9e2ec7b8-85ef-400d-ac94-39a733e729aa\") " pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.702872 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.885768 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"]
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.887190 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.890915 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-59sgr"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.891160 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert"
Oct 02 13:15:43 crc kubenswrapper[4724]: I1002 13:15:43.906709 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"]
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.005095 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-webhook-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.005193 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-apiservice-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.005260 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk26r\" (UniqueName: \"kubernetes.io/projected/2189dcc9-69ab-445f-83a7-2491a7ecb038-kube-api-access-mk26r\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.106368 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-webhook-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.106437 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-apiservice-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.106499 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk26r\" (UniqueName: \"kubernetes.io/projected/2189dcc9-69ab-445f-83a7-2491a7ecb038-kube-api-access-mk26r\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.115144 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-webhook-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.115275 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2189dcc9-69ab-445f-83a7-2491a7ecb038-apiservice-cert\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.137308 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk26r\" (UniqueName: \"kubernetes.io/projected/2189dcc9-69ab-445f-83a7-2491a7ecb038-kube-api-access-mk26r\") pod \"horizon-operator-controller-manager-55ffbdd8b6-6mvsc\" (UID: \"2189dcc9-69ab-445f-83a7-2491a7ecb038\") " pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.178323 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5947468b68-64ngp"]
Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.230288 4724 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.654274 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc"] Oct 02 13:15:44 crc kubenswrapper[4724]: I1002 13:15:44.663025 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" event={"ID":"9e2ec7b8-85ef-400d-ac94-39a733e729aa","Type":"ContainerStarted","Data":"f94019e9d4f2d0547ba02ca63ab2c1e2b8cebe2727871b1196c9a74023d75693"} Oct 02 13:15:45 crc kubenswrapper[4724]: I1002 13:15:45.670904 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" event={"ID":"2189dcc9-69ab-445f-83a7-2491a7ecb038","Type":"ContainerStarted","Data":"1caf28703aef941965f163aa67992102be9aa120d596ae6a2ba793e1746c88b7"} Oct 02 13:15:47 crc kubenswrapper[4724]: I1002 13:15:47.688820 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" event={"ID":"9e2ec7b8-85ef-400d-ac94-39a733e729aa","Type":"ContainerStarted","Data":"eb4e84219e90535c76182c9729d7d3c158a44c6a429383b3cc24ab2a962cf24d"} Oct 02 13:15:47 crc kubenswrapper[4724]: I1002 13:15:47.689790 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" event={"ID":"9e2ec7b8-85ef-400d-ac94-39a733e729aa","Type":"ContainerStarted","Data":"be32752b143390a3d94d7f71817a90f50d464c98227dac566942150cff763fc7"} Oct 02 13:15:47 crc kubenswrapper[4724]: I1002 13:15:47.689818 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" Oct 02 13:15:47 crc kubenswrapper[4724]: I1002 13:15:47.710349 4724 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" podStartSLOduration=2.373173351 podStartE2EDuration="4.710317697s" podCreationTimestamp="2025-10-02 13:15:43 +0000 UTC" firstStartedPulling="2025-10-02 13:15:44.190109978 +0000 UTC m=+1008.644869099" lastFinishedPulling="2025-10-02 13:15:46.527254324 +0000 UTC m=+1010.982013445" observedRunningTime="2025-10-02 13:15:47.706839396 +0000 UTC m=+1012.161598527" watchObservedRunningTime="2025-10-02 13:15:47.710317697 +0000 UTC m=+1012.165076818" Oct 02 13:15:51 crc kubenswrapper[4724]: I1002 13:15:51.735232 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" event={"ID":"2189dcc9-69ab-445f-83a7-2491a7ecb038","Type":"ContainerStarted","Data":"025faa5b8c002d3c70846f997b9ee1b06e7efa805528d7012a22c27a52af69ef"} Oct 02 13:15:51 crc kubenswrapper[4724]: I1002 13:15:51.736003 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" Oct 02 13:15:51 crc kubenswrapper[4724]: I1002 13:15:51.736020 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" event={"ID":"2189dcc9-69ab-445f-83a7-2491a7ecb038","Type":"ContainerStarted","Data":"471d9370359f6a6b165b045dd9017f30c9ea2600648c16115a0c4f76a0a9ef03"} Oct 02 13:15:51 crc kubenswrapper[4724]: I1002 13:15:51.761953 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" podStartSLOduration=2.8852178889999998 podStartE2EDuration="8.761935385s" podCreationTimestamp="2025-10-02 13:15:43 +0000 UTC" firstStartedPulling="2025-10-02 13:15:44.662421026 +0000 UTC m=+1009.117180147" lastFinishedPulling="2025-10-02 13:15:50.539138522 +0000 UTC 
m=+1014.993897643" observedRunningTime="2025-10-02 13:15:51.759767479 +0000 UTC m=+1016.214526600" watchObservedRunningTime="2025-10-02 13:15:51.761935385 +0000 UTC m=+1016.216694496" Oct 02 13:15:53 crc kubenswrapper[4724]: I1002 13:15:53.708132 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5947468b68-64ngp" Oct 02 13:15:58 crc kubenswrapper[4724]: I1002 13:15:58.605502 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-86d476bccd-dbrt4" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.118820 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.126639 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.130368 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.130617 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-jnsxs" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.130861 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.139009 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.139220 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.184858 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-cache\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.184921 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.185007 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.185058 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-lock\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.185075 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vssfp\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-kube-api-access-vssfp\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.288551 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: 
\"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.288638 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-lock\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.288663 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vssfp\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-kube-api-access-vssfp\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.288802 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-cache\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.288860 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.289097 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.289090 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: 
\"bee480a1-1b36-4795-befb-3a6d39bab686\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.289115 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.289565 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:00.789516456 +0000 UTC m=+1025.244275577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.290155 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-lock\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.290484 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bee480a1-1b36-4795-befb-3a6d39bab686-cache\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.315429 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " 
pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.323273 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vssfp\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-kube-api-access-vssfp\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: I1002 13:16:00.796257 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.796527 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.796564 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:00 crc kubenswrapper[4724]: E1002 13:16:00.796623 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:01.796600317 +0000 UTC m=+1026.251359438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.811425 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:01 crc kubenswrapper[4724]: E1002 13:16:01.811655 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:01 crc kubenswrapper[4724]: E1002 13:16:01.811670 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:01 crc kubenswrapper[4724]: E1002 13:16:01.811723 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:03.81169875 +0000 UTC m=+1028.266457871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.864898 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.866439 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.870071 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-mr7lh" Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.901715 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:01 crc kubenswrapper[4724]: I1002 13:16:01.912645 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjvs\" (UniqueName: \"kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs\") pod \"glance-operator-index-7krx9\" (UID: \"8ee33465-a769-4fb1-8c79-5e2d7f46799e\") " pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.013951 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjvs\" (UniqueName: \"kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs\") pod \"glance-operator-index-7krx9\" (UID: \"8ee33465-a769-4fb1-8c79-5e2d7f46799e\") " pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.041560 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjvs\" (UniqueName: \"kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs\") pod \"glance-operator-index-7krx9\" (UID: \"8ee33465-a769-4fb1-8c79-5e2d7f46799e\") " pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.189609 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.481933 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:02 crc kubenswrapper[4724]: W1002 13:16:02.490593 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee33465_a769_4fb1_8c79_5e2d7f46799e.slice/crio-909ad984d5f0cda7b105bfbfe023b7e298bc0c1a770df92feaa6e351267e812c WatchSource:0}: Error finding container 909ad984d5f0cda7b105bfbfe023b7e298bc0c1a770df92feaa6e351267e812c: Status 404 returned error can't find the container with id 909ad984d5f0cda7b105bfbfe023b7e298bc0c1a770df92feaa6e351267e812c Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.819626 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7krx9" event={"ID":"8ee33465-a769-4fb1-8c79-5e2d7f46799e","Type":"ContainerStarted","Data":"909ad984d5f0cda7b105bfbfe023b7e298bc0c1a770df92feaa6e351267e812c"} Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.925238 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv"] Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.927140 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.930558 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Oct 02 13:16:02 crc kubenswrapper[4724]: I1002 13:16:02.939025 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv"] Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.030039 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-log-httpd\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.030103 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-run-httpd\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.030148 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.030444 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cd8ea-3dbd-4076-b104-762a58eb1868-config-data\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: 
\"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.030792 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q9p\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-kube-api-access-n8q9p\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.131861 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cd8ea-3dbd-4076-b104-762a58eb1868-config-data\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.131948 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q9p\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-kube-api-access-n8q9p\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.132010 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-log-httpd\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.132042 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-run-httpd\") pod 
\"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.132102 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.132331 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.132350 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.132408 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:03.632386006 +0000 UTC m=+1028.087145127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.133302 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-log-httpd\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.134112 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2cd8ea-3dbd-4076-b104-762a58eb1868-run-httpd\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.138953 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cd8ea-3dbd-4076-b104-762a58eb1868-config-data\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.159497 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q9p\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-kube-api-access-n8q9p\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.639944 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.640232 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.640618 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.640712 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:04.640679388 +0000 UTC m=+1029.095438509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: I1002 13:16:03.843937 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.844255 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.844300 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:03 crc kubenswrapper[4724]: E1002 13:16:03.844408 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:07.844373364 +0000 UTC m=+1032.299132485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.186199 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ml7nk"] Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.187264 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.190552 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.190881 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.201449 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ml7nk"] Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.238411 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-55ffbdd8b6-6mvsc" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.251767 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.251910 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzf8j\" (UniqueName: \"kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.251948 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices\") pod \"swift-ring-rebalance-ml7nk\" (UID: 
\"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.251991 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.252157 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.252216 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353233 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzf8j\" (UniqueName: \"kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353316 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices\") pod 
\"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353376 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353429 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353494 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.353525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.355135 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.355645 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.355733 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.362963 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.363445 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.382883 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzf8j\" (UniqueName: \"kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j\") pod \"swift-ring-rebalance-ml7nk\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc 
kubenswrapper[4724]: I1002 13:16:04.504301 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.677329 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:04 crc kubenswrapper[4724]: E1002 13:16:04.677558 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:04 crc kubenswrapper[4724]: E1002 13:16:04.677572 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:04 crc kubenswrapper[4724]: E1002 13:16:04.677622 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:06.677603219 +0000 UTC m=+1031.132362330 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.835922 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7krx9" event={"ID":"8ee33465-a769-4fb1-8c79-5e2d7f46799e","Type":"ContainerStarted","Data":"328cc651cdec0c619eb484b32caa6049d58e833b85b6af55196b9f65208c33a2"} Oct 02 13:16:04 crc kubenswrapper[4724]: I1002 13:16:04.859563 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-7krx9" podStartSLOduration=2.640317878 podStartE2EDuration="3.85952316s" podCreationTimestamp="2025-10-02 13:16:01 +0000 UTC" firstStartedPulling="2025-10-02 13:16:02.493451224 +0000 UTC m=+1026.948210345" lastFinishedPulling="2025-10-02 13:16:03.712656506 +0000 UTC m=+1028.167415627" observedRunningTime="2025-10-02 13:16:04.854088219 +0000 UTC m=+1029.308847340" watchObservedRunningTime="2025-10-02 13:16:04.85952316 +0000 UTC m=+1029.314282281" Oct 02 13:16:05 crc kubenswrapper[4724]: I1002 13:16:05.046918 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-ml7nk"] Oct 02 13:16:05 crc kubenswrapper[4724]: I1002 13:16:05.849361 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" event={"ID":"f0e00bd8-bde9-44dd-b71e-ec362b86bd23","Type":"ContainerStarted","Data":"0d1eefbe16f29b69f59ce185f7a97bfd560aa3667b5d6db4d2a48c39e21f5c13"} Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.254007 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.715034 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:06 crc kubenswrapper[4724]: E1002 13:16:06.715287 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:06 crc kubenswrapper[4724]: E1002 13:16:06.715325 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:06 crc kubenswrapper[4724]: E1002 13:16:06.715403 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:10.715375864 +0000 UTC m=+1035.170134985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.855680 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-d8sdv"] Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.857102 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.865895 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-7krx9" podUID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" containerName="registry-server" containerID="cri-o://328cc651cdec0c619eb484b32caa6049d58e833b85b6af55196b9f65208c33a2" gracePeriod=2 Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.893765 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-d8sdv"] Oct 02 13:16:06 crc kubenswrapper[4724]: I1002 13:16:06.918793 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfgs\" (UniqueName: \"kubernetes.io/projected/4520d4cc-dd9a-4dd9-b506-f41c6fc537ed-kube-api-access-mgfgs\") pod \"glance-operator-index-d8sdv\" (UID: \"4520d4cc-dd9a-4dd9-b506-f41c6fc537ed\") " pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.020833 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfgs\" (UniqueName: \"kubernetes.io/projected/4520d4cc-dd9a-4dd9-b506-f41c6fc537ed-kube-api-access-mgfgs\") pod \"glance-operator-index-d8sdv\" (UID: \"4520d4cc-dd9a-4dd9-b506-f41c6fc537ed\") " pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.044660 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfgs\" (UniqueName: \"kubernetes.io/projected/4520d4cc-dd9a-4dd9-b506-f41c6fc537ed-kube-api-access-mgfgs\") pod \"glance-operator-index-d8sdv\" (UID: \"4520d4cc-dd9a-4dd9-b506-f41c6fc537ed\") " pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.201341 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.894050 4724 generic.go:334] "Generic (PLEG): container finished" podID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" containerID="328cc651cdec0c619eb484b32caa6049d58e833b85b6af55196b9f65208c33a2" exitCode=0 Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.894583 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7krx9" event={"ID":"8ee33465-a769-4fb1-8c79-5e2d7f46799e","Type":"ContainerDied","Data":"328cc651cdec0c619eb484b32caa6049d58e833b85b6af55196b9f65208c33a2"} Oct 02 13:16:07 crc kubenswrapper[4724]: I1002 13:16:07.939230 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:07 crc kubenswrapper[4724]: E1002 13:16:07.939451 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:07 crc kubenswrapper[4724]: E1002 13:16:07.939470 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:07 crc kubenswrapper[4724]: E1002 13:16:07.939553 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:15.939513884 +0000 UTC m=+1040.394273005 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:10 crc kubenswrapper[4724]: I1002 13:16:10.802642 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:10 crc kubenswrapper[4724]: E1002 13:16:10.802941 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:10 crc kubenswrapper[4724]: E1002 13:16:10.803196 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:10 crc kubenswrapper[4724]: E1002 13:16:10.803276 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:18.803250775 +0000 UTC m=+1043.258009886 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.189734 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.256701 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.334299 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjvs\" (UniqueName: \"kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs\") pod \"8ee33465-a769-4fb1-8c79-5e2d7f46799e\" (UID: \"8ee33465-a769-4fb1-8c79-5e2d7f46799e\") " Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.354795 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs" (OuterVolumeSpecName: "kube-api-access-rsjvs") pod "8ee33465-a769-4fb1-8c79-5e2d7f46799e" (UID: "8ee33465-a769-4fb1-8c79-5e2d7f46799e"). InnerVolumeSpecName "kube-api-access-rsjvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.453394 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsjvs\" (UniqueName: \"kubernetes.io/projected/8ee33465-a769-4fb1-8c79-5e2d7f46799e-kube-api-access-rsjvs\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.516763 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-d8sdv"] Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.945758 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" event={"ID":"f0e00bd8-bde9-44dd-b71e-ec362b86bd23","Type":"ContainerStarted","Data":"67ae6e463644e82d1fd5c0b1cb549872a354f1b2284f1085869bca85c4cd2af3"} Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.947245 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-7krx9" event={"ID":"8ee33465-a769-4fb1-8c79-5e2d7f46799e","Type":"ContainerDied","Data":"909ad984d5f0cda7b105bfbfe023b7e298bc0c1a770df92feaa6e351267e812c"} Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.947290 4724 scope.go:117] "RemoveContainer" containerID="328cc651cdec0c619eb484b32caa6049d58e833b85b6af55196b9f65208c33a2" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.947288 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-7krx9" Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.949824 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-d8sdv" event={"ID":"4520d4cc-dd9a-4dd9-b506-f41c6fc537ed","Type":"ContainerStarted","Data":"bc6269b77ef0dcf1e3259523e453e7e31f064741fde2d4003d7dbf52007f64f6"} Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.949863 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-d8sdv" event={"ID":"4520d4cc-dd9a-4dd9-b506-f41c6fc537ed","Type":"ContainerStarted","Data":"aae3e6d03e23d2da5967a74ca77c730ccc696e179031e0e5b606642c6e862052"} Oct 02 13:16:12 crc kubenswrapper[4724]: I1002 13:16:12.973942 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" podStartSLOduration=1.9021465819999999 podStartE2EDuration="8.973917924s" podCreationTimestamp="2025-10-02 13:16:04 +0000 UTC" firstStartedPulling="2025-10-02 13:16:05.062503568 +0000 UTC m=+1029.517262689" lastFinishedPulling="2025-10-02 13:16:12.13427491 +0000 UTC m=+1036.589034031" observedRunningTime="2025-10-02 13:16:12.971890182 +0000 UTC m=+1037.426649313" watchObservedRunningTime="2025-10-02 13:16:12.973917924 +0000 UTC m=+1037.428677045" Oct 02 13:16:13 crc kubenswrapper[4724]: I1002 13:16:13.005416 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-d8sdv" podStartSLOduration=6.9602779 podStartE2EDuration="7.005398183s" podCreationTimestamp="2025-10-02 13:16:06 +0000 UTC" firstStartedPulling="2025-10-02 13:16:12.527518806 +0000 UTC m=+1036.982277927" lastFinishedPulling="2025-10-02 13:16:12.572639089 +0000 UTC m=+1037.027398210" observedRunningTime="2025-10-02 13:16:13.000340591 +0000 UTC m=+1037.455099722" watchObservedRunningTime="2025-10-02 13:16:13.005398183 +0000 UTC m=+1037.460157304" 
Oct 02 13:16:13 crc kubenswrapper[4724]: I1002 13:16:13.017934 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:13 crc kubenswrapper[4724]: I1002 13:16:13.027187 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-7krx9"] Oct 02 13:16:14 crc kubenswrapper[4724]: I1002 13:16:14.323774 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" path="/var/lib/kubelet/pods/8ee33465-a769-4fb1-8c79-5e2d7f46799e/volumes" Oct 02 13:16:16 crc kubenswrapper[4724]: I1002 13:16:16.009359 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:16 crc kubenswrapper[4724]: E1002 13:16:16.009649 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:16 crc kubenswrapper[4724]: E1002 13:16:16.011012 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Oct 02 13:16:16 crc kubenswrapper[4724]: E1002 13:16:16.011206 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift podName:bee480a1-1b36-4795-befb-3a6d39bab686 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:32.011180968 +0000 UTC m=+1056.465940089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift") pod "swift-storage-0" (UID: "bee480a1-1b36-4795-befb-3a6d39bab686") : configmap "swift-ring-files" not found Oct 02 13:16:17 crc kubenswrapper[4724]: I1002 13:16:17.201549 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:17 crc kubenswrapper[4724]: I1002 13:16:17.201621 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:17 crc kubenswrapper[4724]: I1002 13:16:17.238569 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:18 crc kubenswrapper[4724]: I1002 13:16:18.021904 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-d8sdv" Oct 02 13:16:18 crc kubenswrapper[4724]: I1002 13:16:18.871757 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:18 crc kubenswrapper[4724]: E1002 13:16:18.871957 4724 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Oct 02 13:16:18 crc kubenswrapper[4724]: E1002 13:16:18.872340 4724 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv: configmap "swift-ring-files" not found Oct 02 13:16:18 crc kubenswrapper[4724]: E1002 13:16:18.872418 4724 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift podName:ea2cd8ea-3dbd-4076-b104-762a58eb1868 nodeName:}" failed. No retries permitted until 2025-10-02 13:16:34.872385245 +0000 UTC m=+1059.327144366 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift") pod "swift-proxy-59cb459c9f-9zrgv" (UID: "ea2cd8ea-3dbd-4076-b104-762a58eb1868") : configmap "swift-ring-files" not found Oct 02 13:16:20 crc kubenswrapper[4724]: I1002 13:16:20.012213 4724 generic.go:334] "Generic (PLEG): container finished" podID="f0e00bd8-bde9-44dd-b71e-ec362b86bd23" containerID="67ae6e463644e82d1fd5c0b1cb549872a354f1b2284f1085869bca85c4cd2af3" exitCode=0 Oct 02 13:16:20 crc kubenswrapper[4724]: I1002 13:16:20.012327 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" event={"ID":"f0e00bd8-bde9-44dd-b71e-ec362b86bd23","Type":"ContainerDied","Data":"67ae6e463644e82d1fd5c0b1cb549872a354f1b2284f1085869bca85c4cd2af3"} Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.328023 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.420348 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.420519 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.420985 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzf8j\" (UniqueName: \"kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.421023 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.421050 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.421084 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift\") pod \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\" (UID: \"f0e00bd8-bde9-44dd-b71e-ec362b86bd23\") " Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.422404 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.423048 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.443256 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.445116 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.445469 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j" (OuterVolumeSpecName: "kube-api-access-kzf8j") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "kube-api-access-kzf8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.447795 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts" (OuterVolumeSpecName: "scripts") pod "f0e00bd8-bde9-44dd-b71e-ec362b86bd23" (UID: "f0e00bd8-bde9-44dd-b71e-ec362b86bd23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.523023 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.523063 4724 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.523077 4724 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.523089 4724 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:21 crc 
kubenswrapper[4724]: I1002 13:16:21.523102 4724 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:21 crc kubenswrapper[4724]: I1002 13:16:21.523114 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzf8j\" (UniqueName: \"kubernetes.io/projected/f0e00bd8-bde9-44dd-b71e-ec362b86bd23-kube-api-access-kzf8j\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:22 crc kubenswrapper[4724]: I1002 13:16:22.031405 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" event={"ID":"f0e00bd8-bde9-44dd-b71e-ec362b86bd23","Type":"ContainerDied","Data":"0d1eefbe16f29b69f59ce185f7a97bfd560aa3667b5d6db4d2a48c39e21f5c13"} Oct 02 13:16:22 crc kubenswrapper[4724]: I1002 13:16:22.031466 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d1eefbe16f29b69f59ce185f7a97bfd560aa3667b5d6db4d2a48c39e21f5c13" Oct 02 13:16:22 crc kubenswrapper[4724]: I1002 13:16:22.031466 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-ml7nk" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.933615 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh"] Oct 02 13:16:24 crc kubenswrapper[4724]: E1002 13:16:24.933946 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e00bd8-bde9-44dd-b71e-ec362b86bd23" containerName="swift-ring-rebalance" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.933960 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e00bd8-bde9-44dd-b71e-ec362b86bd23" containerName="swift-ring-rebalance" Oct 02 13:16:24 crc kubenswrapper[4724]: E1002 13:16:24.933972 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" containerName="registry-server" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.933979 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" containerName="registry-server" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.934138 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee33465-a769-4fb1-8c79-5e2d7f46799e" containerName="registry-server" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.934153 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e00bd8-bde9-44dd-b71e-ec362b86bd23" containerName="swift-ring-rebalance" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.935199 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.937989 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9mmh" Oct 02 13:16:24 crc kubenswrapper[4724]: I1002 13:16:24.955336 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh"] Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.100588 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh56v\" (UniqueName: \"kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.100643 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.100726 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 
13:16:25.202555 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.202727 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh56v\" (UniqueName: \"kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.202774 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.203220 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.203296 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.228792 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh56v\" (UniqueName: \"kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v\") pod \"ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.254461 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:25 crc kubenswrapper[4724]: I1002 13:16:25.705900 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh"] Oct 02 13:16:26 crc kubenswrapper[4724]: I1002 13:16:26.065967 4724 generic.go:334] "Generic (PLEG): container finished" podID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerID="92785a8904a811c52845c1b924e08b5ea34bd792c602bd724291556a1731759c" exitCode=0 Oct 02 13:16:26 crc kubenswrapper[4724]: I1002 13:16:26.066015 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" event={"ID":"51f95db5-8f9f-449c-8bc7-04ebf10c4f97","Type":"ContainerDied","Data":"92785a8904a811c52845c1b924e08b5ea34bd792c602bd724291556a1731759c"} Oct 02 13:16:26 crc kubenswrapper[4724]: I1002 13:16:26.066046 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" event={"ID":"51f95db5-8f9f-449c-8bc7-04ebf10c4f97","Type":"ContainerStarted","Data":"6e8e43c4c48288f93e31cd51c42f409f63bdc52738facdcada66666e55bc1b95"} Oct 02 13:16:27 crc kubenswrapper[4724]: I1002 13:16:27.077467 4724 generic.go:334] "Generic (PLEG): container finished" podID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerID="d05b9f4139205d0c950e17b4770be2084ea2c53e5c2904631036d964b0061dc5" exitCode=0 Oct 02 13:16:27 crc kubenswrapper[4724]: I1002 13:16:27.077573 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" event={"ID":"51f95db5-8f9f-449c-8bc7-04ebf10c4f97","Type":"ContainerDied","Data":"d05b9f4139205d0c950e17b4770be2084ea2c53e5c2904631036d964b0061dc5"} Oct 02 13:16:28 crc kubenswrapper[4724]: I1002 13:16:28.088293 4724 generic.go:334] "Generic (PLEG): container finished" podID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerID="f0aabb39ac21e398a09b5baf8386f88601c4bf7eba196188abb1d1616f099573" exitCode=0 Oct 02 13:16:28 crc kubenswrapper[4724]: I1002 13:16:28.088388 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" event={"ID":"51f95db5-8f9f-449c-8bc7-04ebf10c4f97","Type":"ContainerDied","Data":"f0aabb39ac21e398a09b5baf8386f88601c4bf7eba196188abb1d1616f099573"} Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.429323 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.497902 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh56v\" (UniqueName: \"kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v\") pod \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.498264 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle\") pod \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.498300 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util\") pod \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\" (UID: \"51f95db5-8f9f-449c-8bc7-04ebf10c4f97\") " Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.506701 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v" (OuterVolumeSpecName: "kube-api-access-lh56v") pod "51f95db5-8f9f-449c-8bc7-04ebf10c4f97" (UID: "51f95db5-8f9f-449c-8bc7-04ebf10c4f97"). InnerVolumeSpecName "kube-api-access-lh56v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.534305 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle" (OuterVolumeSpecName: "bundle") pod "51f95db5-8f9f-449c-8bc7-04ebf10c4f97" (UID: "51f95db5-8f9f-449c-8bc7-04ebf10c4f97"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.543388 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util" (OuterVolumeSpecName: "util") pod "51f95db5-8f9f-449c-8bc7-04ebf10c4f97" (UID: "51f95db5-8f9f-449c-8bc7-04ebf10c4f97"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.600181 4724 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.600224 4724 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-util\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:29 crc kubenswrapper[4724]: I1002 13:16:29.600235 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh56v\" (UniqueName: \"kubernetes.io/projected/51f95db5-8f9f-449c-8bc7-04ebf10c4f97-kube-api-access-lh56v\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:30 crc kubenswrapper[4724]: I1002 13:16:30.107107 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" event={"ID":"51f95db5-8f9f-449c-8bc7-04ebf10c4f97","Type":"ContainerDied","Data":"6e8e43c4c48288f93e31cd51c42f409f63bdc52738facdcada66666e55bc1b95"} Oct 02 13:16:30 crc kubenswrapper[4724]: I1002 13:16:30.107769 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8e43c4c48288f93e31cd51c42f409f63bdc52738facdcada66666e55bc1b95" Oct 02 13:16:30 crc kubenswrapper[4724]: I1002 13:16:30.107192 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh" Oct 02 13:16:32 crc kubenswrapper[4724]: I1002 13:16:32.033944 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:32 crc kubenswrapper[4724]: I1002 13:16:32.041377 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bee480a1-1b36-4795-befb-3a6d39bab686-etc-swift\") pod \"swift-storage-0\" (UID: \"bee480a1-1b36-4795-befb-3a6d39bab686\") " pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:32 crc kubenswrapper[4724]: I1002 13:16:32.297050 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Oct 02 13:16:32 crc kubenswrapper[4724]: I1002 13:16:32.743920 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Oct 02 13:16:33 crc kubenswrapper[4724]: I1002 13:16:33.129571 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"a469204b27b07fe7810b8427c529f38c4a86a5e9b176c916eb2dd2bd34033cb7"} Oct 02 13:16:34 crc kubenswrapper[4724]: I1002 13:16:34.884403 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:34 crc kubenswrapper[4724]: I1002 13:16:34.908713 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea2cd8ea-3dbd-4076-b104-762a58eb1868-etc-swift\") pod \"swift-proxy-59cb459c9f-9zrgv\" (UID: \"ea2cd8ea-3dbd-4076-b104-762a58eb1868\") " pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.048802 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.163410 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"1f4de5e98b4cacc553a6fbe9ba91a94781261dc366c1682df8e5e54adeb067b7"} Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.163458 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"b6bdd1f3c9a33fa5ea4593ae2feefc417dee23e73ae9c1784e62105d710171c5"} Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.163467 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"e3275a79fbd387304448805032ac142711a4cd37c55fe4ce132d2eddbe7a4c6e"} Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.163477 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"dd5f227b9343f06d2d92f0903f76e05944f227486185a548288a5690b77161cb"} Oct 02 13:16:35 crc kubenswrapper[4724]: I1002 13:16:35.312935 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv"] Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.181890 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" event={"ID":"ea2cd8ea-3dbd-4076-b104-762a58eb1868","Type":"ContainerStarted","Data":"44a126c96c9a9f533028c0c00efcbaa2b7ee35a5370382312cb02050909121b3"} Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.182636 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.182654 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.182663 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" event={"ID":"ea2cd8ea-3dbd-4076-b104-762a58eb1868","Type":"ContainerStarted","Data":"1b421ad86a9b09d4cad2f761933e2f6f8dd6a57cc4110884b1dda9c2ac42065d"} Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.182673 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" event={"ID":"ea2cd8ea-3dbd-4076-b104-762a58eb1868","Type":"ContainerStarted","Data":"ff366c6b572902ac248ffcb2358f715a9d897ba16d6f9432e3052aae1a581815"} Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.197775 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"4a3146786fd1ecba5303dde1f8873a238f855ea9c801e21e1f3cc7e734d71271"} Oct 02 13:16:36 crc kubenswrapper[4724]: I1002 13:16:36.227398 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" podStartSLOduration=34.227377533 podStartE2EDuration="34.227377533s" podCreationTimestamp="2025-10-02 13:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
13:16:36.225913955 +0000 UTC m=+1060.680673076" watchObservedRunningTime="2025-10-02 13:16:36.227377533 +0000 UTC m=+1060.682136654" Oct 02 13:16:37 crc kubenswrapper[4724]: I1002 13:16:37.208922 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"0533f6002882b315f07d8a237e9dfa354386487224d03ef86d7da8093f243ace"} Oct 02 13:16:37 crc kubenswrapper[4724]: I1002 13:16:37.209425 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"30f368e45bd87d4c7751fcfa41c7d5cc26423ab1191a021c0aca8f64986afb52"} Oct 02 13:16:37 crc kubenswrapper[4724]: I1002 13:16:37.209437 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"abe3034b40b475325886b8f65a33a39111693dc58a0444da26f0b05f2bf1ee49"} Oct 02 13:16:39 crc kubenswrapper[4724]: I1002 13:16:39.234748 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"648c442e6cbd6b1661389874b07d04fd42062bcb2f68afbebdca5593c4c69de5"} Oct 02 13:16:39 crc kubenswrapper[4724]: I1002 13:16:39.235382 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"11f03096bbf1bc60a2ca756a51c4bd4f59e85d84eb8fd724959c847ae5f11bfe"} Oct 02 13:16:39 crc kubenswrapper[4724]: I1002 13:16:39.235395 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"ee8483c23e2f8c8bcf4c79b126a60dd1a715f730e990379a2571af718c9da185"} 
Oct 02 13:16:39 crc kubenswrapper[4724]: I1002 13:16:39.235403 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"2bb57234a240975a67c78c075625dd613e1f6fb84dd3a5e9ff4fcd0e4882b3ad"} Oct 02 13:16:39 crc kubenswrapper[4724]: I1002 13:16:39.235415 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"f19fb88bd94ded79a1a66c50278f3b77959aa25f4985db7dcca89088fc8902ae"} Oct 02 13:16:40 crc kubenswrapper[4724]: I1002 13:16:40.067667 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:40 crc kubenswrapper[4724]: I1002 13:16:40.252721 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"9155b686be64eff17892d9392f00266ce92a684bd7a18a81ce584ae4fde5f907"} Oct 02 13:16:40 crc kubenswrapper[4724]: I1002 13:16:40.252789 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bee480a1-1b36-4795-befb-3a6d39bab686","Type":"ContainerStarted","Data":"09044e16a08c6b2cf8b6e001d72994930a1eb7f12532c49dad6af5863a34da0f"} Oct 02 13:16:40 crc kubenswrapper[4724]: I1002 13:16:40.291447 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=35.951924466 podStartE2EDuration="41.291428099s" podCreationTimestamp="2025-10-02 13:15:59 +0000 UTC" firstStartedPulling="2025-10-02 13:16:32.754734707 +0000 UTC m=+1057.209493838" lastFinishedPulling="2025-10-02 13:16:38.09423835 +0000 UTC m=+1062.548997471" observedRunningTime="2025-10-02 13:16:40.286892001 +0000 UTC m=+1064.741651142" 
watchObservedRunningTime="2025-10-02 13:16:40.291428099 +0000 UTC m=+1064.746187220" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.190415 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh"] Oct 02 13:16:41 crc kubenswrapper[4724]: E1002 13:16:41.190884 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="pull" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.190909 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="pull" Oct 02 13:16:41 crc kubenswrapper[4724]: E1002 13:16:41.190942 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="util" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.190950 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="util" Oct 02 13:16:41 crc kubenswrapper[4724]: E1002 13:16:41.190969 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="extract" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.190978 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="extract" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.191201 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f95db5-8f9f-449c-8bc7-04ebf10c4f97" containerName="extract" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.192296 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.196790 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dn6q5" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.197054 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.208076 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh"] Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.299458 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-apiservice-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.299652 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prm5\" (UniqueName: \"kubernetes.io/projected/632aef42-b937-4494-bf3a-7251fb7fb975-kube-api-access-7prm5\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.299696 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-webhook-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: 
\"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.400676 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-webhook-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.401246 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-apiservice-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.401345 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prm5\" (UniqueName: \"kubernetes.io/projected/632aef42-b937-4494-bf3a-7251fb7fb975-kube-api-access-7prm5\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.424181 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-webhook-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.429351 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/632aef42-b937-4494-bf3a-7251fb7fb975-apiservice-cert\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.449593 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prm5\" (UniqueName: \"kubernetes.io/projected/632aef42-b937-4494-bf3a-7251fb7fb975-kube-api-access-7prm5\") pod \"glance-operator-controller-manager-7d4448d4c-tcwsh\" (UID: \"632aef42-b937-4494-bf3a-7251fb7fb975\") " pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:41 crc kubenswrapper[4724]: I1002 13:16:41.551799 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:42 crc kubenswrapper[4724]: W1002 13:16:42.089075 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632aef42_b937_4494_bf3a_7251fb7fb975.slice/crio-559b1ac24244b4c50d0d686bfa34d455e1952810c21f6b90dffa77beb57bffe0 WatchSource:0}: Error finding container 559b1ac24244b4c50d0d686bfa34d455e1952810c21f6b90dffa77beb57bffe0: Status 404 returned error can't find the container with id 559b1ac24244b4c50d0d686bfa34d455e1952810c21f6b90dffa77beb57bffe0 Oct 02 13:16:42 crc kubenswrapper[4724]: I1002 13:16:42.089094 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh"] Oct 02 13:16:42 crc kubenswrapper[4724]: I1002 13:16:42.270856 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" 
event={"ID":"632aef42-b937-4494-bf3a-7251fb7fb975","Type":"ContainerStarted","Data":"559b1ac24244b4c50d0d686bfa34d455e1952810c21f6b90dffa77beb57bffe0"} Oct 02 13:16:44 crc kubenswrapper[4724]: I1002 13:16:44.302721 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" event={"ID":"632aef42-b937-4494-bf3a-7251fb7fb975","Type":"ContainerStarted","Data":"0eab4c34112ac2dfdf42b69ddd9b8433ea607bfb99f2789fb997b230a2151d61"} Oct 02 13:16:45 crc kubenswrapper[4724]: I1002 13:16:45.055823 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-59cb459c9f-9zrgv" Oct 02 13:16:45 crc kubenswrapper[4724]: I1002 13:16:45.313341 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" event={"ID":"632aef42-b937-4494-bf3a-7251fb7fb975","Type":"ContainerStarted","Data":"06da6a4a1c695e2c822eccc6dc8b6897153c7bbcf4850e3529d4f2d8dffae78c"} Oct 02 13:16:45 crc kubenswrapper[4724]: I1002 13:16:45.313485 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:45 crc kubenswrapper[4724]: I1002 13:16:45.335429 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" podStartSLOduration=2.167405137 podStartE2EDuration="4.335403846s" podCreationTimestamp="2025-10-02 13:16:41 +0000 UTC" firstStartedPulling="2025-10-02 13:16:42.091803507 +0000 UTC m=+1066.546562628" lastFinishedPulling="2025-10-02 13:16:44.259802216 +0000 UTC m=+1068.714561337" observedRunningTime="2025-10-02 13:16:45.332221204 +0000 UTC m=+1069.786980325" watchObservedRunningTime="2025-10-02 13:16:45.335403846 +0000 UTC m=+1069.790162967" Oct 02 13:16:51 crc kubenswrapper[4724]: I1002 13:16:51.557958 4724 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d4448d4c-tcwsh" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.214793 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.216434 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.219359 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-sqnjs"] Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.221788 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.222640 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.222848 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-87nz2" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.223341 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.224086 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.237527 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.254329 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-sqnjs"] Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.347000 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.347070 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964qn\" (UniqueName: \"kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn\") pod \"glance-db-create-sqnjs\" (UID: \"be0e82aa-a6fd-4bfb-925e-573f0a6b960a\") " pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.347109 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cwvq\" (UniqueName: \"kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.347169 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " 
pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.347224 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.448659 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.448758 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964qn\" (UniqueName: \"kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn\") pod \"glance-db-create-sqnjs\" (UID: \"be0e82aa-a6fd-4bfb-925e-573f0a6b960a\") " pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.448801 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cwvq\" (UniqueName: \"kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.448876 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 
13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.448910 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.450307 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.451005 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.457265 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.473637 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cwvq\" (UniqueName: \"kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq\") pod \"openstackclient\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.474303 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-964qn\" (UniqueName: \"kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn\") pod \"glance-db-create-sqnjs\" (UID: \"be0e82aa-a6fd-4bfb-925e-573f0a6b960a\") " pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.538560 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:16:55 crc kubenswrapper[4724]: I1002 13:16:55.554008 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.035705 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:16:56 crc kubenswrapper[4724]: W1002 13:16:56.042566 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de12d0c_65ee_44be_80d9_3b3256e71ada.slice/crio-0c73e82513b58ff3ca8dd77284b135b0b2f42657f81bcc6fd597987a25b46716 WatchSource:0}: Error finding container 0c73e82513b58ff3ca8dd77284b135b0b2f42657f81bcc6fd597987a25b46716: Status 404 returned error can't find the container with id 0c73e82513b58ff3ca8dd77284b135b0b2f42657f81bcc6fd597987a25b46716 Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.087872 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-sqnjs"] Oct 02 13:16:56 crc kubenswrapper[4724]: W1002 13:16:56.090667 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe0e82aa_a6fd_4bfb_925e_573f0a6b960a.slice/crio-3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5 WatchSource:0}: Error finding container 3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5: Status 404 returned error can't find the container with id 
3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5 Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.414549 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7de12d0c-65ee-44be-80d9-3b3256e71ada","Type":"ContainerStarted","Data":"0c73e82513b58ff3ca8dd77284b135b0b2f42657f81bcc6fd597987a25b46716"} Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.417642 4724 generic.go:334] "Generic (PLEG): container finished" podID="be0e82aa-a6fd-4bfb-925e-573f0a6b960a" containerID="855ba79ee2030b76fd14e92a5fbea63c6147672e90d9e04bb1c29e6712153ee2" exitCode=0 Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.417813 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sqnjs" event={"ID":"be0e82aa-a6fd-4bfb-925e-573f0a6b960a","Type":"ContainerDied","Data":"855ba79ee2030b76fd14e92a5fbea63c6147672e90d9e04bb1c29e6712153ee2"} Oct 02 13:16:56 crc kubenswrapper[4724]: I1002 13:16:56.417913 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sqnjs" event={"ID":"be0e82aa-a6fd-4bfb-925e-573f0a6b960a","Type":"ContainerStarted","Data":"3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5"} Oct 02 13:16:57 crc kubenswrapper[4724]: I1002 13:16:57.780872 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:16:57 crc kubenswrapper[4724]: I1002 13:16:57.897162 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-964qn\" (UniqueName: \"kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn\") pod \"be0e82aa-a6fd-4bfb-925e-573f0a6b960a\" (UID: \"be0e82aa-a6fd-4bfb-925e-573f0a6b960a\") " Oct 02 13:16:57 crc kubenswrapper[4724]: I1002 13:16:57.904732 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn" (OuterVolumeSpecName: "kube-api-access-964qn") pod "be0e82aa-a6fd-4bfb-925e-573f0a6b960a" (UID: "be0e82aa-a6fd-4bfb-925e-573f0a6b960a"). InnerVolumeSpecName "kube-api-access-964qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:57 crc kubenswrapper[4724]: I1002 13:16:57.999657 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-964qn\" (UniqueName: \"kubernetes.io/projected/be0e82aa-a6fd-4bfb-925e-573f0a6b960a-kube-api-access-964qn\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:58 crc kubenswrapper[4724]: I1002 13:16:58.438460 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sqnjs" event={"ID":"be0e82aa-a6fd-4bfb-925e-573f0a6b960a","Type":"ContainerDied","Data":"3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5"} Oct 02 13:16:58 crc kubenswrapper[4724]: I1002 13:16:58.438871 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3d7b0f47e264b0f80cc0c2e1768da07b9d1eed5df86e79df8df13b94e678f5" Oct 02 13:16:58 crc kubenswrapper[4724]: I1002 13:16:58.438564 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sqnjs" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.235852 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-f957-account-create-pxkch"] Oct 02 13:17:05 crc kubenswrapper[4724]: E1002 13:17:05.237098 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0e82aa-a6fd-4bfb-925e-573f0a6b960a" containerName="mariadb-database-create" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.237124 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0e82aa-a6fd-4bfb-925e-573f0a6b960a" containerName="mariadb-database-create" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.237455 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0e82aa-a6fd-4bfb-925e-573f0a6b960a" containerName="mariadb-database-create" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.238475 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.241166 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.243510 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f957-account-create-pxkch"] Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.331234 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbwq\" (UniqueName: \"kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq\") pod \"glance-f957-account-create-pxkch\" (UID: \"c45996f0-02dc-44bc-af15-c6e272ccc2f8\") " pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.432791 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4mbwq\" (UniqueName: \"kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq\") pod \"glance-f957-account-create-pxkch\" (UID: \"c45996f0-02dc-44bc-af15-c6e272ccc2f8\") " pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.458396 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbwq\" (UniqueName: \"kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq\") pod \"glance-f957-account-create-pxkch\" (UID: \"c45996f0-02dc-44bc-af15-c6e272ccc2f8\") " pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.507764 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7de12d0c-65ee-44be-80d9-3b3256e71ada","Type":"ContainerStarted","Data":"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe"} Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.532513 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.320617572 podStartE2EDuration="10.532485172s" podCreationTimestamp="2025-10-02 13:16:55 +0000 UTC" firstStartedPulling="2025-10-02 13:16:56.045822521 +0000 UTC m=+1080.500581642" lastFinishedPulling="2025-10-02 13:17:04.257690111 +0000 UTC m=+1088.712449242" observedRunningTime="2025-10-02 13:17:05.528109088 +0000 UTC m=+1089.982868209" watchObservedRunningTime="2025-10-02 13:17:05.532485172 +0000 UTC m=+1089.987244293" Oct 02 13:17:05 crc kubenswrapper[4724]: I1002 13:17:05.578704 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:06 crc kubenswrapper[4724]: I1002 13:17:06.062501 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f957-account-create-pxkch"] Oct 02 13:17:06 crc kubenswrapper[4724]: W1002 13:17:06.072315 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc45996f0_02dc_44bc_af15_c6e272ccc2f8.slice/crio-809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642 WatchSource:0}: Error finding container 809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642: Status 404 returned error can't find the container with id 809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642 Oct 02 13:17:06 crc kubenswrapper[4724]: I1002 13:17:06.520812 4724 generic.go:334] "Generic (PLEG): container finished" podID="c45996f0-02dc-44bc-af15-c6e272ccc2f8" containerID="af3133398d7664a5460f6ea8651515a58d074973e709d0f7249da215d86e3c20" exitCode=0 Oct 02 13:17:06 crc kubenswrapper[4724]: I1002 13:17:06.520944 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" event={"ID":"c45996f0-02dc-44bc-af15-c6e272ccc2f8","Type":"ContainerDied","Data":"af3133398d7664a5460f6ea8651515a58d074973e709d0f7249da215d86e3c20"} Oct 02 13:17:06 crc kubenswrapper[4724]: I1002 13:17:06.521467 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" event={"ID":"c45996f0-02dc-44bc-af15-c6e272ccc2f8","Type":"ContainerStarted","Data":"809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642"} Oct 02 13:17:07 crc kubenswrapper[4724]: I1002 13:17:07.892571 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:07 crc kubenswrapper[4724]: I1002 13:17:07.981224 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mbwq\" (UniqueName: \"kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq\") pod \"c45996f0-02dc-44bc-af15-c6e272ccc2f8\" (UID: \"c45996f0-02dc-44bc-af15-c6e272ccc2f8\") " Oct 02 13:17:07 crc kubenswrapper[4724]: I1002 13:17:07.991844 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq" (OuterVolumeSpecName: "kube-api-access-4mbwq") pod "c45996f0-02dc-44bc-af15-c6e272ccc2f8" (UID: "c45996f0-02dc-44bc-af15-c6e272ccc2f8"). InnerVolumeSpecName "kube-api-access-4mbwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:17:08 crc kubenswrapper[4724]: I1002 13:17:08.083342 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mbwq\" (UniqueName: \"kubernetes.io/projected/c45996f0-02dc-44bc-af15-c6e272ccc2f8-kube-api-access-4mbwq\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:08 crc kubenswrapper[4724]: I1002 13:17:08.540359 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" event={"ID":"c45996f0-02dc-44bc-af15-c6e272ccc2f8","Type":"ContainerDied","Data":"809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642"} Oct 02 13:17:08 crc kubenswrapper[4724]: I1002 13:17:08.540922 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809c64bfa92d76c0b5a60c2b96757342752aeb84426dec1d971f6bb613443642" Oct 02 13:17:08 crc kubenswrapper[4724]: I1002 13:17:08.540621 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f957-account-create-pxkch" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.390476 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntrlf"] Oct 02 13:17:10 crc kubenswrapper[4724]: E1002 13:17:10.391284 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45996f0-02dc-44bc-af15-c6e272ccc2f8" containerName="mariadb-account-create" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.391302 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45996f0-02dc-44bc-af15-c6e272ccc2f8" containerName="mariadb-account-create" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.391451 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45996f0-02dc-44bc-af15-c6e272ccc2f8" containerName="mariadb-account-create" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.392012 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.395470 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-njbfn" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.396060 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.417056 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntrlf"] Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.526439 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc 
kubenswrapper[4724]: I1002 13:17:10.526839 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqdzs\" (UniqueName: \"kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.527030 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.628414 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqdzs\" (UniqueName: \"kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.629323 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.629448 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc 
kubenswrapper[4724]: I1002 13:17:10.643507 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.643804 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.660718 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqdzs\" (UniqueName: \"kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs\") pod \"glance-db-sync-ntrlf\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:10 crc kubenswrapper[4724]: I1002 13:17:10.717075 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:11 crc kubenswrapper[4724]: I1002 13:17:11.026132 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntrlf"] Oct 02 13:17:11 crc kubenswrapper[4724]: I1002 13:17:11.569986 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntrlf" event={"ID":"7a00d229-c6ff-4167-82e9-9a4804a008f6","Type":"ContainerStarted","Data":"d518413b162f78f808f7f0bfdfe66701a39a99edb178e90cd171aed0055f4864"} Oct 02 13:17:25 crc kubenswrapper[4724]: I1002 13:17:25.704775 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntrlf" event={"ID":"7a00d229-c6ff-4167-82e9-9a4804a008f6","Type":"ContainerStarted","Data":"29a33a00e43225a41a31bc5b7da3b75a51b4d974afce86f87c7f8f77cac9c8cd"} Oct 02 13:17:32 crc kubenswrapper[4724]: I1002 13:17:32.764639 4724 generic.go:334] "Generic (PLEG): container finished" podID="7a00d229-c6ff-4167-82e9-9a4804a008f6" containerID="29a33a00e43225a41a31bc5b7da3b75a51b4d974afce86f87c7f8f77cac9c8cd" exitCode=0 Oct 02 13:17:32 crc kubenswrapper[4724]: I1002 13:17:32.764800 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntrlf" event={"ID":"7a00d229-c6ff-4167-82e9-9a4804a008f6","Type":"ContainerDied","Data":"29a33a00e43225a41a31bc5b7da3b75a51b4d974afce86f87c7f8f77cac9c8cd"} Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.150247 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.246148 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqdzs\" (UniqueName: \"kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs\") pod \"7a00d229-c6ff-4167-82e9-9a4804a008f6\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.246301 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data\") pod \"7a00d229-c6ff-4167-82e9-9a4804a008f6\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.246400 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data\") pod \"7a00d229-c6ff-4167-82e9-9a4804a008f6\" (UID: \"7a00d229-c6ff-4167-82e9-9a4804a008f6\") " Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.254019 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs" (OuterVolumeSpecName: "kube-api-access-wqdzs") pod "7a00d229-c6ff-4167-82e9-9a4804a008f6" (UID: "7a00d229-c6ff-4167-82e9-9a4804a008f6"). InnerVolumeSpecName "kube-api-access-wqdzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.254435 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7a00d229-c6ff-4167-82e9-9a4804a008f6" (UID: "7a00d229-c6ff-4167-82e9-9a4804a008f6"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.292021 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data" (OuterVolumeSpecName: "config-data") pod "7a00d229-c6ff-4167-82e9-9a4804a008f6" (UID: "7a00d229-c6ff-4167-82e9-9a4804a008f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.348976 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.349024 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqdzs\" (UniqueName: \"kubernetes.io/projected/7a00d229-c6ff-4167-82e9-9a4804a008f6-kube-api-access-wqdzs\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.349036 4724 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7a00d229-c6ff-4167-82e9-9a4804a008f6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.785418 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-ntrlf" event={"ID":"7a00d229-c6ff-4167-82e9-9a4804a008f6","Type":"ContainerDied","Data":"d518413b162f78f808f7f0bfdfe66701a39a99edb178e90cd171aed0055f4864"} Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.785968 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d518413b162f78f808f7f0bfdfe66701a39a99edb178e90cd171aed0055f4864" Oct 02 13:17:34 crc kubenswrapper[4724]: I1002 13:17:34.785505 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-ntrlf" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.137596 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:36 crc kubenswrapper[4724]: E1002 13:17:36.138376 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a00d229-c6ff-4167-82e9-9a4804a008f6" containerName="glance-db-sync" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.138393 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a00d229-c6ff-4167-82e9-9a4804a008f6" containerName="glance-db-sync" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.138697 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a00d229-c6ff-4167-82e9-9a4804a008f6" containerName="glance-db-sync" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.143676 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.148684 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.149919 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.150234 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-njbfn" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.162321 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.190630 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.192223 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.193647 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.206125 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292071 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292526 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292646 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292670 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bpmwf\" (UniqueName: \"kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292887 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.292986 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293031 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293069 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293133 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293184 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293263 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293465 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293500 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.293591 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.294067 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.322611 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395186 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395249 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395274 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme\") pod \"glance-default-single-0\" (UID: 
\"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395304 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395329 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395352 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395373 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395394 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " 
pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395399 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395413 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395593 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmwf\" (UniqueName: \"kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395607 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395607 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc 
kubenswrapper[4724]: I1002 13:17:36.395627 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395748 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.395896 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396056 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396115 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396190 4724 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396216 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396260 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396290 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396388 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396406 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: 
\"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396431 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396493 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396566 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396593 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396621 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " 
pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396714 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396793 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tzpv\" (UniqueName: \"kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.396890 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.397029 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.397168 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " 
pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.397180 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.397200 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.397330 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.400811 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.401731 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.416054 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bpmwf\" (UniqueName: \"kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.420289 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499315 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499374 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499401 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499420 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499440 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499458 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499476 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499505 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499559 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts\") pod \"glance-default-single-1\" (UID: 
\"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499589 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499614 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499639 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tzpv\" (UniqueName: \"kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499655 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499676 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc 
kubenswrapper[4724]: I1002 13:17:36.499723 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499762 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499793 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499841 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499880 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499903 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.499901 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.500135 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.500173 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.500383 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.500406 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs\") pod \"glance-default-single-1\" (UID: 
\"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.506247 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.506526 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.513168 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.522216 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tzpv\" (UniqueName: \"kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.525825 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.531391 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-single-1\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.605521 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:36 crc kubenswrapper[4724]: I1002 13:17:36.606002 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.058128 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.146490 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:37 crc kubenswrapper[4724]: W1002 13:17:37.151315 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87d64c2_8321_4eb3_a16d_ef89959dd571.slice/crio-ec93ee67ad1a83512dcacaefa29d1b81e1a4aaf18183ac195878ccd6e437ffc0 WatchSource:0}: Error finding container ec93ee67ad1a83512dcacaefa29d1b81e1a4aaf18183ac195878ccd6e437ffc0: Status 404 returned error can't find the container with id ec93ee67ad1a83512dcacaefa29d1b81e1a4aaf18183ac195878ccd6e437ffc0 Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.810442 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerStarted","Data":"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781"} Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.811102 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerStarted","Data":"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453"} 
Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.811121 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerStarted","Data":"ec93ee67ad1a83512dcacaefa29d1b81e1a4aaf18183ac195878ccd6e437ffc0"} Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.810593 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-httpd" containerID="cri-o://21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" gracePeriod=30 Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.810506 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-log" containerID="cri-o://eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" gracePeriod=30 Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.823053 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerStarted","Data":"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5"} Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.823134 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerStarted","Data":"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98"} Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.823150 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerStarted","Data":"b03b039f8c2e8fbca7ab1b1ba595223852e8ed2ce1652f5f28505a4fdc24ce82"} Oct 02 13:17:37 crc kubenswrapper[4724]: I1002 13:17:37.851084 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.851056347 podStartE2EDuration="2.851056347s" podCreationTimestamp="2025-10-02 13:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:17:37.840192444 +0000 UTC m=+1122.294951605" watchObservedRunningTime="2025-10-02 13:17:37.851056347 +0000 UTC m=+1122.305815468" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.204050 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.240740 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.236992243 podStartE2EDuration="3.236992243s" podCreationTimestamp="2025-10-02 13:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:17:37.881610852 +0000 UTC m=+1122.336369983" watchObservedRunningTime="2025-10-02 13:17:38.236992243 +0000 UTC m=+1122.691751364" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336245 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336306 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336329 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336367 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336411 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336408 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336430 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336457 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev" (OuterVolumeSpecName: "dev") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336482 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336522 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336570 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336592 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tzpv\" (UniqueName: \"kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336612 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336635 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336662 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336699 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336745 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi\") pod \"b87d64c2-8321-4eb3-a16d-ef89959dd571\" (UID: \"b87d64c2-8321-4eb3-a16d-ef89959dd571\") " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.336868 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs" (OuterVolumeSpecName: "logs") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337134 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337149 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337158 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337166 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337197 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337292 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run" (OuterVolumeSpecName: "run") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337431 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337470 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.337440 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys" (OuterVolumeSpecName: "sys") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.341706 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.341768 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.342607 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv" (OuterVolumeSpecName: "kube-api-access-6tzpv") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "kube-api-access-6tzpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.343597 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts" (OuterVolumeSpecName: "scripts") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.375372 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data" (OuterVolumeSpecName: "config-data") pod "b87d64c2-8321-4eb3-a16d-ef89959dd571" (UID: "b87d64c2-8321-4eb3-a16d-ef89959dd571"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438777 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438841 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438857 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d64c2-8321-4eb3-a16d-ef89959dd571-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438871 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438888 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tzpv\" (UniqueName: \"kubernetes.io/projected/b87d64c2-8321-4eb3-a16d-ef89959dd571-kube-api-access-6tzpv\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438903 4724 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438914 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438925 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/b87d64c2-8321-4eb3-a16d-ef89959dd571-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438944 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.438955 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b87d64c2-8321-4eb3-a16d-ef89959dd571-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.453313 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.458583 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.540361 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.540410 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.835510 4724 generic.go:334] "Generic (PLEG): container finished" podID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerID="21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" exitCode=143 Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.835566 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerID="eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" exitCode=143 Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.836195 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerDied","Data":"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781"} Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.836376 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerDied","Data":"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453"} Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.836458 4724 scope.go:117] "RemoveContainer" containerID="21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.836471 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b87d64c2-8321-4eb3-a16d-ef89959dd571","Type":"ContainerDied","Data":"ec93ee67ad1a83512dcacaefa29d1b81e1a4aaf18183ac195878ccd6e437ffc0"} Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.836219 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.879172 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.880336 4724 scope.go:117] "RemoveContainer" containerID="eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.889474 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.901904 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:38 crc kubenswrapper[4724]: E1002 13:17:38.902334 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-log" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.902353 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-log" Oct 02 13:17:38 crc kubenswrapper[4724]: E1002 13:17:38.902961 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-httpd" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.902983 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-httpd" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.903160 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-httpd" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.903185 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" containerName="glance-log" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.904163 4724 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.907610 4724 scope.go:117] "RemoveContainer" containerID="21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" Oct 02 13:17:38 crc kubenswrapper[4724]: E1002 13:17:38.908059 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781\": container with ID starting with 21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781 not found: ID does not exist" containerID="21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.908124 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781"} err="failed to get container status \"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781\": rpc error: code = NotFound desc = could not find container \"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781\": container with ID starting with 21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781 not found: ID does not exist" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.908157 4724 scope.go:117] "RemoveContainer" containerID="eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" Oct 02 13:17:38 crc kubenswrapper[4724]: E1002 13:17:38.911649 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453\": container with ID starting with eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453 not found: ID does not exist" 
containerID="eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.911897 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453"} err="failed to get container status \"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453\": rpc error: code = NotFound desc = could not find container \"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453\": container with ID starting with eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453 not found: ID does not exist" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.912011 4724 scope.go:117] "RemoveContainer" containerID="21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.913680 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781"} err="failed to get container status \"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781\": rpc error: code = NotFound desc = could not find container \"21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781\": container with ID starting with 21b624104bfe0d8c6c14fe5800d6f5e9eac839826651e28026a5c40640d53781 not found: ID does not exist" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.913722 4724 scope.go:117] "RemoveContainer" containerID="eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.914686 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453"} err="failed to get container status \"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453\": rpc error: code = NotFound desc = could 
not find container \"eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453\": container with ID starting with eb0002c032839563c4ebef50b4dcfff6730c299d0ff0789c47ecfdab3e44d453 not found: ID does not exist" Oct 02 13:17:38 crc kubenswrapper[4724]: I1002 13:17:38.926673 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.060831 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.060896 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.060923 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.060966 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061005 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061068 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061103 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061127 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061164 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061202 4724 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86mm\" (UniqueName: \"kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061222 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061245 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061268 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.061287 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162269 4724 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162328 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162350 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162369 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162395 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162412 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162426 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162451 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162469 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162503 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162525 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run\") pod \"glance-default-single-1\" (UID: 
\"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162557 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162585 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.162622 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86mm\" (UniqueName: \"kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.163404 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164056 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 
13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164137 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164171 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164210 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164242 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164388 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164441 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164561 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164513 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.164876 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.170696 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.183693 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-1\" 
(UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.185451 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.195525 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.204147 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s86mm\" (UniqueName: \"kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm\") pod \"glance-default-single-1\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.226268 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.452884 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.849756 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerStarted","Data":"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663"} Oct 02 13:17:39 crc kubenswrapper[4724]: I1002 13:17:39.849961 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerStarted","Data":"c1df8ce39a79beb7d20a7250b681031a8be1ee60ba0757db2c102975dc50d173"} Oct 02 13:17:40 crc kubenswrapper[4724]: I1002 13:17:40.324233 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87d64c2-8321-4eb3-a16d-ef89959dd571" path="/var/lib/kubelet/pods/b87d64c2-8321-4eb3-a16d-ef89959dd571/volumes" Oct 02 13:17:40 crc kubenswrapper[4724]: I1002 13:17:40.860274 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerStarted","Data":"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787"} Oct 02 13:17:40 crc kubenswrapper[4724]: I1002 13:17:40.895693 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.895664092 podStartE2EDuration="2.895664092s" podCreationTimestamp="2025-10-02 13:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:17:40.888348372 +0000 UTC m=+1125.343107713" watchObservedRunningTime="2025-10-02 13:17:40.895664092 +0000 
UTC m=+1125.350423223" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.514169 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.514511 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.541198 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.552006 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.913143 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:46 crc kubenswrapper[4724]: I1002 13:17:46.913259 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.141671 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.141854 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.172283 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.228619 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.228693 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.276658 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.278678 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.935048 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:49 crc kubenswrapper[4724]: I1002 13:17:49.935116 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.013625 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.014715 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.017477 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.081797 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.082069 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-log" containerID="cri-o://6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98" gracePeriod=30 Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.084407 4724 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-single-0" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-httpd" containerID="cri-o://c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5" gracePeriod=30 Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.976066 4724 generic.go:334] "Generic (PLEG): container finished" podID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerID="6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98" exitCode=143 Oct 02 13:17:53 crc kubenswrapper[4724]: I1002 13:17:53.976209 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerDied","Data":"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98"} Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.613353 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702261 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702350 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpmwf\" (UniqueName: \"kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702361 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys" (OuterVolumeSpecName: "sys") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: 
"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702414 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702443 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702498 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702518 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702583 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.702622 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run" (OuterVolumeSpecName: "run") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703052 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs" (OuterVolumeSpecName: "logs") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703087 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703431 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703474 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703491 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703519 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703568 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703589 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703615 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703673 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703688 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703689 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev" (OuterVolumeSpecName: "dev") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703746 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\" (UID: \"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b\") " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.703823 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704139 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704160 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704171 4724 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704181 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704206 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704214 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704222 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704231 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.704239 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.709456 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.711296 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts" (OuterVolumeSpecName: "scripts") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.711559 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf" (OuterVolumeSpecName: "kube-api-access-bpmwf") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "kube-api-access-bpmwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.713488 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.749565 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data" (OuterVolumeSpecName: "config-data") pod "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" (UID: "f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.805437 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.805475 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.805504 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.805522 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.805552 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpmwf\" (UniqueName: \"kubernetes.io/projected/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b-kube-api-access-bpmwf\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.819907 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.823523 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.907524 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:56 crc kubenswrapper[4724]: I1002 13:17:56.907605 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.003413 4724 generic.go:334] "Generic (PLEG): container finished" podID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerID="c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5" exitCode=0 Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.003475 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.003479 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerDied","Data":"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5"} Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.004029 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b","Type":"ContainerDied","Data":"b03b039f8c2e8fbca7ab1b1ba595223852e8ed2ce1652f5f28505a4fdc24ce82"} Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.004134 4724 scope.go:117] "RemoveContainer" containerID="c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.039804 4724 scope.go:117] "RemoveContainer" containerID="6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.048767 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.057024 
4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.073676 4724 scope.go:117] "RemoveContainer" containerID="c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5" Oct 02 13:17:57 crc kubenswrapper[4724]: E1002 13:17:57.076914 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5\": container with ID starting with c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5 not found: ID does not exist" containerID="c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.076986 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5"} err="failed to get container status \"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5\": rpc error: code = NotFound desc = could not find container \"c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5\": container with ID starting with c7e82767fb4d4c2a485ec79187036790c22923814388094356dd4023140286d5 not found: ID does not exist" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.077032 4724 scope.go:117] "RemoveContainer" containerID="6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98" Oct 02 13:17:57 crc kubenswrapper[4724]: E1002 13:17:57.077691 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98\": container with ID starting with 6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98 not found: ID does not exist" containerID="6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98" Oct 02 
13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.077752 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98"} err="failed to get container status \"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98\": rpc error: code = NotFound desc = could not find container \"6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98\": container with ID starting with 6624b3f6bb19e1abf0ab4a7c63c15dee9044e0598becc245d10c7f7da068ea98 not found: ID does not exist" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.080272 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:57 crc kubenswrapper[4724]: E1002 13:17:57.080832 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-log" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.080934 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-log" Oct 02 13:17:57 crc kubenswrapper[4724]: E1002 13:17:57.081011 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-httpd" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.081526 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-httpd" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.081837 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-log" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.081944 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" containerName="glance-httpd" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.082895 4724 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.101935 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.213759 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.213856 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.213970 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214014 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214036 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214056 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214289 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214407 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214460 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214711 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214841 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2znf\" (UniqueName: \"kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214928 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.214977 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.215010 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316775 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316863 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316894 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316914 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316918 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.316968 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317053 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317099 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317125 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317152 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317234 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") 
" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317261 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317374 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317409 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2znf\" (UniqueName: \"kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317443 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317468 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 
13:17:57.317485 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317512 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317566 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317604 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317703 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317818 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317847 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.317821 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.318288 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.326151 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.328645 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data\") pod \"glance-default-single-0\" (UID: 
\"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.340002 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2znf\" (UniqueName: \"kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.340960 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.341556 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.405426 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:17:57 crc kubenswrapper[4724]: I1002 13:17:57.913705 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:17:58 crc kubenswrapper[4724]: I1002 13:17:58.015037 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerStarted","Data":"6ac1806db3050a9a67fbceefd913e380edf98e8c411640100daf36db12086f29"} Oct 02 13:17:58 crc kubenswrapper[4724]: I1002 13:17:58.330459 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b" path="/var/lib/kubelet/pods/f7873fe0-aa1b-41d5-b31c-b5b6fec0bf0b/volumes" Oct 02 13:17:59 crc kubenswrapper[4724]: I1002 13:17:59.035993 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerStarted","Data":"2dd30b1b025105749a2c05184515332aafd2853313c576ba111850fdc2367956"} Oct 02 13:17:59 crc kubenswrapper[4724]: I1002 13:17:59.036603 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerStarted","Data":"a4f1f84713982de5c8cc2f8dbf766c6a1fc5fc4a318619eb3b1d0424c9b459d5"} Oct 02 13:17:59 crc kubenswrapper[4724]: I1002 13:17:59.159184 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.159153325 podStartE2EDuration="2.159153325s" podCreationTimestamp="2025-10-02 13:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:17:59.15510882 +0000 UTC m=+1143.609867961" watchObservedRunningTime="2025-10-02 13:17:59.159153325 +0000 
UTC m=+1143.613912446" Oct 02 13:18:04 crc kubenswrapper[4724]: I1002 13:18:04.734892 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:18:04 crc kubenswrapper[4724]: I1002 13:18:04.736764 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:18:07 crc kubenswrapper[4724]: I1002 13:18:07.405779 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:07 crc kubenswrapper[4724]: I1002 13:18:07.405863 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:07 crc kubenswrapper[4724]: I1002 13:18:07.446087 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:07 crc kubenswrapper[4724]: I1002 13:18:07.458432 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:08 crc kubenswrapper[4724]: I1002 13:18:08.117370 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:08 crc kubenswrapper[4724]: I1002 13:18:08.117827 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:10 crc kubenswrapper[4724]: I1002 13:18:10.133859 4724 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Oct 02 13:18:10 crc kubenswrapper[4724]: I1002 13:18:10.134284 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 13:18:10 crc kubenswrapper[4724]: I1002 13:18:10.287804 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:10 crc kubenswrapper[4724]: I1002 13:18:10.291598 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.528268 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntrlf"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.535019 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-ntrlf"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.606855 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancef957-account-delete-wwll8"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.608001 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.623062 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.623310 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-log" containerID="cri-o://a4f1f84713982de5c8cc2f8dbf766c6a1fc5fc4a318619eb3b1d0424c9b459d5" gracePeriod=30 Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.623434 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-httpd" containerID="cri-o://2dd30b1b025105749a2c05184515332aafd2853313c576ba111850fdc2367956" gracePeriod=30 Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.625821 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt75r\" (UniqueName: \"kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r\") pod \"glancef957-account-delete-wwll8\" (UID: \"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee\") " pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.642503 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef957-account-delete-wwll8"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.649640 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.650031 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" 
containerName="glance-httpd" containerID="cri-o://c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787" gracePeriod=30 Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.650349 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-log" containerID="cri-o://5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663" gracePeriod=30 Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.727138 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.727401 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="7de12d0c-65ee-44be-80d9-3b3256e71ada" containerName="openstackclient" containerID="cri-o://1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe" gracePeriod=30 Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.727516 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt75r\" (UniqueName: \"kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r\") pod \"glancef957-account-delete-wwll8\" (UID: \"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee\") " pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.737642 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.737704 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" 
podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.757776 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt75r\" (UniqueName: \"kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r\") pod \"glancef957-account-delete-wwll8\" (UID: \"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee\") " pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:34 crc kubenswrapper[4724]: I1002 13:18:34.930003 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.167881 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.235072 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config\") pod \"7de12d0c-65ee-44be-80d9-3b3256e71ada\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.235147 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret\") pod \"7de12d0c-65ee-44be-80d9-3b3256e71ada\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.235177 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cwvq\" (UniqueName: \"kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq\") 
pod \"7de12d0c-65ee-44be-80d9-3b3256e71ada\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.235209 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts\") pod \"7de12d0c-65ee-44be-80d9-3b3256e71ada\" (UID: \"7de12d0c-65ee-44be-80d9-3b3256e71ada\") " Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.236551 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "7de12d0c-65ee-44be-80d9-3b3256e71ada" (UID: "7de12d0c-65ee-44be-80d9-3b3256e71ada"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.241012 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq" (OuterVolumeSpecName: "kube-api-access-4cwvq") pod "7de12d0c-65ee-44be-80d9-3b3256e71ada" (UID: "7de12d0c-65ee-44be-80d9-3b3256e71ada"). InnerVolumeSpecName "kube-api-access-4cwvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.253056 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7de12d0c-65ee-44be-80d9-3b3256e71ada" (UID: "7de12d0c-65ee-44be-80d9-3b3256e71ada"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.262588 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7de12d0c-65ee-44be-80d9-3b3256e71ada" (UID: "7de12d0c-65ee-44be-80d9-3b3256e71ada"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.336246 4724 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.336279 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cwvq\" (UniqueName: \"kubernetes.io/projected/7de12d0c-65ee-44be-80d9-3b3256e71ada-kube-api-access-4cwvq\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.336289 4724 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.336300 4724 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7de12d0c-65ee-44be-80d9-3b3256e71ada-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.345014 4724 generic.go:334] "Generic (PLEG): container finished" podID="7de12d0c-65ee-44be-80d9-3b3256e71ada" containerID="1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe" exitCode=143 Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.345077 4724 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.345095 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7de12d0c-65ee-44be-80d9-3b3256e71ada","Type":"ContainerDied","Data":"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe"} Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.345459 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7de12d0c-65ee-44be-80d9-3b3256e71ada","Type":"ContainerDied","Data":"0c73e82513b58ff3ca8dd77284b135b0b2f42657f81bcc6fd597987a25b46716"} Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.345496 4724 scope.go:117] "RemoveContainer" containerID="1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.348076 4724 generic.go:334] "Generic (PLEG): container finished" podID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerID="5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663" exitCode=143 Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.348139 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerDied","Data":"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663"} Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.351450 4724 generic.go:334] "Generic (PLEG): container finished" podID="cc12278a-3ee7-4395-a462-163666b30769" containerID="a4f1f84713982de5c8cc2f8dbf766c6a1fc5fc4a318619eb3b1d0424c9b459d5" exitCode=143 Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.351529 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerDied","Data":"a4f1f84713982de5c8cc2f8dbf766c6a1fc5fc4a318619eb3b1d0424c9b459d5"} Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.374300 4724 scope.go:117] "RemoveContainer" containerID="1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe" Oct 02 13:18:35 crc kubenswrapper[4724]: E1002 13:18:35.377574 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe\": container with ID starting with 1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe not found: ID does not exist" containerID="1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.377685 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe"} err="failed to get container status \"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe\": rpc error: code = NotFound desc = could not find container \"1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe\": container with ID starting with 1430f81098af0253dde0a989008abafa704a1e00272ea537638071609d1242fe not found: ID does not exist" Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.387027 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.394646 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:18:35 crc kubenswrapper[4724]: I1002 13:18:35.418674 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef957-account-delete-wwll8"] Oct 02 13:18:36 crc kubenswrapper[4724]: I1002 13:18:36.323136 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7a00d229-c6ff-4167-82e9-9a4804a008f6" path="/var/lib/kubelet/pods/7a00d229-c6ff-4167-82e9-9a4804a008f6/volumes" Oct 02 13:18:36 crc kubenswrapper[4724]: I1002 13:18:36.324506 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de12d0c-65ee-44be-80d9-3b3256e71ada" path="/var/lib/kubelet/pods/7de12d0c-65ee-44be-80d9-3b3256e71ada/volumes" Oct 02 13:18:36 crc kubenswrapper[4724]: I1002 13:18:36.363422 4724 generic.go:334] "Generic (PLEG): container finished" podID="5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" containerID="17ee2c8a8311d96e0001a4526d8cb38f53b7e0ee5a566195baf35e75544be84b" exitCode=0 Oct 02 13:18:36 crc kubenswrapper[4724]: I1002 13:18:36.363495 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" event={"ID":"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee","Type":"ContainerDied","Data":"17ee2c8a8311d96e0001a4526d8cb38f53b7e0ee5a566195baf35e75544be84b"} Oct 02 13:18:36 crc kubenswrapper[4724]: I1002 13:18:36.363617 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" event={"ID":"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee","Type":"ContainerStarted","Data":"683e69aae05640676b2c3c7899e9ca4782ccc5665afc62482d9bf67d51b943c4"} Oct 02 13:18:37 crc kubenswrapper[4724]: I1002 13:18:37.699394 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:37 crc kubenswrapper[4724]: I1002 13:18:37.775452 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt75r\" (UniqueName: \"kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r\") pod \"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee\" (UID: \"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee\") " Oct 02 13:18:37 crc kubenswrapper[4724]: I1002 13:18:37.784404 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r" (OuterVolumeSpecName: "kube-api-access-nt75r") pod "5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" (UID: "5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee"). InnerVolumeSpecName "kube-api-access-nt75r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:37 crc kubenswrapper[4724]: I1002 13:18:37.878295 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt75r\" (UniqueName: \"kubernetes.io/projected/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee-kube-api-access-nt75r\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.179360 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183023 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183082 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183126 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183149 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183177 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183201 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme\") pod 
\"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183229 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183281 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183302 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183317 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183355 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183385 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183407 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86mm\" (UniqueName: \"kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183428 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\" (UID: \"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183166 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys" (OuterVolumeSpecName: "sys") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183202 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev" (OuterVolumeSpecName: "dev") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183556 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run" (OuterVolumeSpecName: "run") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183628 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183628 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.183739 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs" (OuterVolumeSpecName: "logs") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184212 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184306 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184275 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184370 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184383 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184397 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184433 4724 
reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184446 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.184677 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.186378 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.187949 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts" (OuterVolumeSpecName: "scripts") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.187948 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.197768 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm" (OuterVolumeSpecName: "kube-api-access-s86mm") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "kube-api-access-s86mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.236653 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data" (OuterVolumeSpecName: "config-data") pod "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" (UID: "9b9e3dca-3d82-4599-aa8c-536aa8bb75cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286070 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286117 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286127 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286137 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286145 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286154 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286169 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86mm\" (UniqueName: \"kubernetes.io/projected/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb-kube-api-access-s86mm\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.286191 4724 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.301107 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.301463 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.380520 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.380518 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef957-account-delete-wwll8" event={"ID":"5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee","Type":"ContainerDied","Data":"683e69aae05640676b2c3c7899e9ca4782ccc5665afc62482d9bf67d51b943c4"} Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.380950 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="683e69aae05640676b2c3c7899e9ca4782ccc5665afc62482d9bf67d51b943c4" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.385492 4724 generic.go:334] "Generic (PLEG): container finished" podID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerID="c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787" exitCode=0 Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.385550 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerDied","Data":"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787"} Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.385573 
4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9b9e3dca-3d82-4599-aa8c-536aa8bb75cb","Type":"ContainerDied","Data":"c1df8ce39a79beb7d20a7250b681031a8be1ee60ba0757db2c102975dc50d173"} Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.385579 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.385590 4724 scope.go:117] "RemoveContainer" containerID="c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.386708 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.386722 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.389054 4724 generic.go:334] "Generic (PLEG): container finished" podID="cc12278a-3ee7-4395-a462-163666b30769" containerID="2dd30b1b025105749a2c05184515332aafd2853313c576ba111850fdc2367956" exitCode=0 Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.389073 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerDied","Data":"2dd30b1b025105749a2c05184515332aafd2853313c576ba111850fdc2367956"} Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.408788 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.421195 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.428384 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.428757 4724 scope.go:117] "RemoveContainer" containerID="5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.475154 4724 scope.go:117] "RemoveContainer" containerID="c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787" Oct 02 13:18:38 crc kubenswrapper[4724]: E1002 13:18:38.475723 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787\": container with ID starting with c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787 not found: ID does not exist" containerID="c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.475776 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787"} err="failed to get container status \"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787\": rpc error: code = NotFound desc = could not find container \"c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787\": container with ID starting with c929a03f5c282a9b52b41fb6075a9e750aff3165c96c8036a69473c1dc613787 not found: ID does not exist" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.475798 4724 scope.go:117] "RemoveContainer" containerID="5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663" Oct 02 13:18:38 crc kubenswrapper[4724]: E1002 13:18:38.476024 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663\": container with ID starting with 5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663 not found: ID does not exist" containerID="5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.476044 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663"} err="failed to get container status \"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663\": rpc error: code = NotFound desc = could not find container \"5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663\": container with ID starting with 5bff29cbf45dd297d838cb4eb017ed7acee29ae1497066d7132b3d3b90443663 not found: ID does not exist" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.487885 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.487954 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.487981 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488006 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488050 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488170 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488195 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488212 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488234 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: 
I1002 13:18:38.488268 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2znf\" (UniqueName: \"kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488296 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488322 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488344 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488359 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run\") pod \"cc12278a-3ee7-4395-a462-163666b30769\" (UID: \"cc12278a-3ee7-4395-a462-163666b30769\") " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488934 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.488977 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489270 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs" (OuterVolumeSpecName: "logs") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489309 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489336 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489353 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys" (OuterVolumeSpecName: "sys") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489776 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev" (OuterVolumeSpecName: "dev") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489851 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.489935 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run" (OuterVolumeSpecName: "run") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.495998 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.496079 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf" (OuterVolumeSpecName: "kube-api-access-d2znf") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "kube-api-access-d2znf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.497929 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.498102 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts" (OuterVolumeSpecName: "scripts") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.529021 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data" (OuterVolumeSpecName: "config-data") pod "cc12278a-3ee7-4395-a462-163666b30769" (UID: "cc12278a-3ee7-4395-a462-163666b30769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590087 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590117 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590125 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590136 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590144 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590183 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on 
node \"crc\" " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590195 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590208 4724 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590219 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590231 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2znf\" (UniqueName: \"kubernetes.io/projected/cc12278a-3ee7-4395-a462-163666b30769-kube-api-access-d2znf\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590257 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590269 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc12278a-3ee7-4395-a462-163666b30769-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590279 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cc12278a-3ee7-4395-a462-163666b30769-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.590296 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cc12278a-3ee7-4395-a462-163666b30769-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.603718 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.604240 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.691998 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:38 crc kubenswrapper[4724]: I1002 13:18:38.692050 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.400242 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"cc12278a-3ee7-4395-a462-163666b30769","Type":"ContainerDied","Data":"6ac1806db3050a9a67fbceefd913e380edf98e8c411640100daf36db12086f29"} Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.400303 4724 scope.go:117] "RemoveContainer" containerID="2dd30b1b025105749a2c05184515332aafd2853313c576ba111850fdc2367956" Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.401505 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.432066 4724 scope.go:117] "RemoveContainer" containerID="a4f1f84713982de5c8cc2f8dbf766c6a1fc5fc4a318619eb3b1d0424c9b459d5" Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.462644 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.472263 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.613227 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-sqnjs"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.620880 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-sqnjs"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.626580 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancef957-account-delete-wwll8"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.630940 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancef957-account-delete-wwll8"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.634915 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-f957-account-create-pxkch"] Oct 02 13:18:39 crc kubenswrapper[4724]: I1002 13:18:39.638919 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-f957-account-create-pxkch"] Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.168496 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-q9h5p"] Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.168904 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-log" Oct 02 
13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.168923 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-log" Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.168940 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de12d0c-65ee-44be-80d9-3b3256e71ada" containerName="openstackclient" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.168949 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de12d0c-65ee-44be-80d9-3b3256e71ada" containerName="openstackclient" Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.168962 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" containerName="mariadb-account-delete" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.168971 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" containerName="mariadb-account-delete" Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.168991 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-log" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.168998 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-log" Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.169007 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-httpd" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169014 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-httpd" Oct 02 13:18:40 crc kubenswrapper[4724]: E1002 13:18:40.169032 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-httpd" Oct 02 13:18:40 crc 
kubenswrapper[4724]: I1002 13:18:40.169040 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-httpd" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169184 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" containerName="mariadb-account-delete" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169198 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-httpd" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169207 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-httpd" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169216 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc12278a-3ee7-4395-a462-163666b30769" containerName="glance-log" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169227 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de12d0c-65ee-44be-80d9-3b3256e71ada" containerName="openstackclient" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169247 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" containerName="glance-log" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.169917 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.176984 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-q9h5p"] Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.213151 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2z2\" (UniqueName: \"kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2\") pod \"glance-db-create-q9h5p\" (UID: \"6fb35493-0f64-406c-9997-3c0a5a47c8bb\") " pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.313961 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2z2\" (UniqueName: \"kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2\") pod \"glance-db-create-q9h5p\" (UID: \"6fb35493-0f64-406c-9997-3c0a5a47c8bb\") " pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.323949 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee" path="/var/lib/kubelet/pods/5d1d50fa-2407-4b6d-82e8-22ed9f5a02ee/volumes" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.324993 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9e3dca-3d82-4599-aa8c-536aa8bb75cb" path="/var/lib/kubelet/pods/9b9e3dca-3d82-4599-aa8c-536aa8bb75cb/volumes" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.325688 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0e82aa-a6fd-4bfb-925e-573f0a6b960a" path="/var/lib/kubelet/pods/be0e82aa-a6fd-4bfb-925e-573f0a6b960a/volumes" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.326998 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c45996f0-02dc-44bc-af15-c6e272ccc2f8" 
path="/var/lib/kubelet/pods/c45996f0-02dc-44bc-af15-c6e272ccc2f8/volumes" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.327726 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc12278a-3ee7-4395-a462-163666b30769" path="/var/lib/kubelet/pods/cc12278a-3ee7-4395-a462-163666b30769/volumes" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.336673 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2z2\" (UniqueName: \"kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2\") pod \"glance-db-create-q9h5p\" (UID: \"6fb35493-0f64-406c-9997-3c0a5a47c8bb\") " pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.489102 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:40 crc kubenswrapper[4724]: I1002 13:18:40.711508 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-q9h5p"] Oct 02 13:18:41 crc kubenswrapper[4724]: I1002 13:18:41.425025 4724 generic.go:334] "Generic (PLEG): container finished" podID="6fb35493-0f64-406c-9997-3c0a5a47c8bb" containerID="0ea2f0515db05054b992d352e760b563d7394d0f342ef6f49397c4bc7e9e437d" exitCode=0 Oct 02 13:18:41 crc kubenswrapper[4724]: I1002 13:18:41.425136 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q9h5p" event={"ID":"6fb35493-0f64-406c-9997-3c0a5a47c8bb","Type":"ContainerDied","Data":"0ea2f0515db05054b992d352e760b563d7394d0f342ef6f49397c4bc7e9e437d"} Oct 02 13:18:41 crc kubenswrapper[4724]: I1002 13:18:41.425414 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q9h5p" event={"ID":"6fb35493-0f64-406c-9997-3c0a5a47c8bb","Type":"ContainerStarted","Data":"bc2fede19cda8c42cedb8ab7e6223e50f4bf5b0d50eb5bbfc4c9a9463ade2373"} Oct 02 13:18:42 crc kubenswrapper[4724]: 
I1002 13:18:42.750932 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:42 crc kubenswrapper[4724]: I1002 13:18:42.955000 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj2z2\" (UniqueName: \"kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2\") pod \"6fb35493-0f64-406c-9997-3c0a5a47c8bb\" (UID: \"6fb35493-0f64-406c-9997-3c0a5a47c8bb\") " Oct 02 13:18:42 crc kubenswrapper[4724]: I1002 13:18:42.966839 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2" (OuterVolumeSpecName: "kube-api-access-gj2z2") pod "6fb35493-0f64-406c-9997-3c0a5a47c8bb" (UID: "6fb35493-0f64-406c-9997-3c0a5a47c8bb"). InnerVolumeSpecName "kube-api-access-gj2z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:43 crc kubenswrapper[4724]: I1002 13:18:43.057424 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj2z2\" (UniqueName: \"kubernetes.io/projected/6fb35493-0f64-406c-9997-3c0a5a47c8bb-kube-api-access-gj2z2\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:43 crc kubenswrapper[4724]: I1002 13:18:43.448636 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-q9h5p" event={"ID":"6fb35493-0f64-406c-9997-3c0a5a47c8bb","Type":"ContainerDied","Data":"bc2fede19cda8c42cedb8ab7e6223e50f4bf5b0d50eb5bbfc4c9a9463ade2373"} Oct 02 13:18:43 crc kubenswrapper[4724]: I1002 13:18:43.449008 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2fede19cda8c42cedb8ab7e6223e50f4bf5b0d50eb5bbfc4c9a9463ade2373" Oct 02 13:18:43 crc kubenswrapper[4724]: I1002 13:18:43.448739 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-q9h5p" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.192062 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3a8a-account-create-kzm2c"] Oct 02 13:18:50 crc kubenswrapper[4724]: E1002 13:18:50.192947 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb35493-0f64-406c-9997-3c0a5a47c8bb" containerName="mariadb-database-create" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.192962 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb35493-0f64-406c-9997-3c0a5a47c8bb" containerName="mariadb-database-create" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.193086 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb35493-0f64-406c-9997-3c0a5a47c8bb" containerName="mariadb-database-create" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.193578 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.197177 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.209241 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3a8a-account-create-kzm2c"] Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.368217 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5xd\" (UniqueName: \"kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd\") pod \"glance-3a8a-account-create-kzm2c\" (UID: \"d769764e-3687-4760-94eb-3516b6dbbaa1\") " pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.470207 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5j5xd\" (UniqueName: \"kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd\") pod \"glance-3a8a-account-create-kzm2c\" (UID: \"d769764e-3687-4760-94eb-3516b6dbbaa1\") " pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.492008 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5xd\" (UniqueName: \"kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd\") pod \"glance-3a8a-account-create-kzm2c\" (UID: \"d769764e-3687-4760-94eb-3516b6dbbaa1\") " pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.524796 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:50 crc kubenswrapper[4724]: I1002 13:18:50.965989 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3a8a-account-create-kzm2c"] Oct 02 13:18:51 crc kubenswrapper[4724]: I1002 13:18:51.525462 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" event={"ID":"d769764e-3687-4760-94eb-3516b6dbbaa1","Type":"ContainerStarted","Data":"ba9cacc4608bb53e03730519aefed01612414d7fb19cb85cab79a12b7368f1c9"} Oct 02 13:18:51 crc kubenswrapper[4724]: I1002 13:18:51.525870 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" event={"ID":"d769764e-3687-4760-94eb-3516b6dbbaa1","Type":"ContainerStarted","Data":"679c9cd412462999269e9697e8e0468b3dfbbbcf3c1873f50267d9795bf35b70"} Oct 02 13:18:51 crc kubenswrapper[4724]: E1002 13:18:51.963291 4724 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd769764e_3687_4760_94eb_3516b6dbbaa1.slice/crio-ba9cacc4608bb53e03730519aefed01612414d7fb19cb85cab79a12b7368f1c9.scope\": RecentStats: unable to find data in memory cache]" Oct 02 13:18:52 crc kubenswrapper[4724]: I1002 13:18:52.536773 4724 generic.go:334] "Generic (PLEG): container finished" podID="d769764e-3687-4760-94eb-3516b6dbbaa1" containerID="ba9cacc4608bb53e03730519aefed01612414d7fb19cb85cab79a12b7368f1c9" exitCode=0 Oct 02 13:18:52 crc kubenswrapper[4724]: I1002 13:18:52.536845 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" event={"ID":"d769764e-3687-4760-94eb-3516b6dbbaa1","Type":"ContainerDied","Data":"ba9cacc4608bb53e03730519aefed01612414d7fb19cb85cab79a12b7368f1c9"} Oct 02 13:18:53 crc kubenswrapper[4724]: I1002 13:18:53.912196 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.025136 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j5xd\" (UniqueName: \"kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd\") pod \"d769764e-3687-4760-94eb-3516b6dbbaa1\" (UID: \"d769764e-3687-4760-94eb-3516b6dbbaa1\") " Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.033202 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd" (OuterVolumeSpecName: "kube-api-access-5j5xd") pod "d769764e-3687-4760-94eb-3516b6dbbaa1" (UID: "d769764e-3687-4760-94eb-3516b6dbbaa1"). InnerVolumeSpecName "kube-api-access-5j5xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.127051 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5xd\" (UniqueName: \"kubernetes.io/projected/d769764e-3687-4760-94eb-3516b6dbbaa1-kube-api-access-5j5xd\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.576978 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" event={"ID":"d769764e-3687-4760-94eb-3516b6dbbaa1","Type":"ContainerDied","Data":"679c9cd412462999269e9697e8e0468b3dfbbbcf3c1873f50267d9795bf35b70"} Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.577043 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679c9cd412462999269e9697e8e0468b3dfbbbcf3c1873f50267d9795bf35b70" Oct 02 13:18:54 crc kubenswrapper[4724]: I1002 13:18:54.577140 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3a8a-account-create-kzm2c" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.247120 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-m27n9"] Oct 02 13:18:55 crc kubenswrapper[4724]: E1002 13:18:55.247515 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d769764e-3687-4760-94eb-3516b6dbbaa1" containerName="mariadb-account-create" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.247530 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="d769764e-3687-4760-94eb-3516b6dbbaa1" containerName="mariadb-account-create" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.247729 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="d769764e-3687-4760-94eb-3516b6dbbaa1" containerName="mariadb-account-create" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.248375 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.250903 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-5t7ct" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.251372 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.254310 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.266133 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m27n9"] Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.349144 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hsfm\" (UniqueName: \"kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.349474 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.349650 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc 
kubenswrapper[4724]: I1002 13:18:55.349792 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.451844 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hsfm\" (UniqueName: \"kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.452001 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.452071 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.452121 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.457187 
4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.462561 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.464810 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.495421 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hsfm\" (UniqueName: \"kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm\") pod \"glance-db-sync-m27n9\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:55 crc kubenswrapper[4724]: I1002 13:18:55.566800 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:18:56 crc kubenswrapper[4724]: I1002 13:18:56.015728 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m27n9"] Oct 02 13:18:56 crc kubenswrapper[4724]: W1002 13:18:56.023318 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea464d5_2ade_47ca_a61f_6cb445974084.slice/crio-05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43 WatchSource:0}: Error finding container 05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43: Status 404 returned error can't find the container with id 05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43 Oct 02 13:18:56 crc kubenswrapper[4724]: I1002 13:18:56.594446 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m27n9" event={"ID":"0ea464d5-2ade-47ca-a61f-6cb445974084","Type":"ContainerStarted","Data":"204e46c1c7037466030eedd4eb34f20a40256857e4df30f98b58bc98a12fc1f7"} Oct 02 13:18:56 crc kubenswrapper[4724]: I1002 13:18:56.595285 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m27n9" event={"ID":"0ea464d5-2ade-47ca-a61f-6cb445974084","Type":"ContainerStarted","Data":"05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43"} Oct 02 13:18:57 crc kubenswrapper[4724]: I1002 13:18:57.622657 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-m27n9" podStartSLOduration=2.622631917 podStartE2EDuration="2.622631917s" podCreationTimestamp="2025-10-02 13:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:18:57.617017561 +0000 UTC m=+1202.071776692" watchObservedRunningTime="2025-10-02 13:18:57.622631917 +0000 UTC m=+1202.077391048" Oct 02 13:19:00 crc 
kubenswrapper[4724]: I1002 13:19:00.627235 4724 generic.go:334] "Generic (PLEG): container finished" podID="0ea464d5-2ade-47ca-a61f-6cb445974084" containerID="204e46c1c7037466030eedd4eb34f20a40256857e4df30f98b58bc98a12fc1f7" exitCode=0 Oct 02 13:19:00 crc kubenswrapper[4724]: I1002 13:19:00.627343 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m27n9" event={"ID":"0ea464d5-2ade-47ca-a61f-6cb445974084","Type":"ContainerDied","Data":"204e46c1c7037466030eedd4eb34f20a40256857e4df30f98b58bc98a12fc1f7"} Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.906106 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.952123 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data\") pod \"0ea464d5-2ade-47ca-a61f-6cb445974084\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.952223 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hsfm\" (UniqueName: \"kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm\") pod \"0ea464d5-2ade-47ca-a61f-6cb445974084\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.952252 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle\") pod \"0ea464d5-2ade-47ca-a61f-6cb445974084\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.952280 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data\") pod \"0ea464d5-2ade-47ca-a61f-6cb445974084\" (UID: \"0ea464d5-2ade-47ca-a61f-6cb445974084\") " Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.957620 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ea464d5-2ade-47ca-a61f-6cb445974084" (UID: "0ea464d5-2ade-47ca-a61f-6cb445974084"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.957633 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm" (OuterVolumeSpecName: "kube-api-access-8hsfm") pod "0ea464d5-2ade-47ca-a61f-6cb445974084" (UID: "0ea464d5-2ade-47ca-a61f-6cb445974084"). InnerVolumeSpecName "kube-api-access-8hsfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.974459 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ea464d5-2ade-47ca-a61f-6cb445974084" (UID: "0ea464d5-2ade-47ca-a61f-6cb445974084"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:01 crc kubenswrapper[4724]: I1002 13:19:01.990411 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data" (OuterVolumeSpecName: "config-data") pod "0ea464d5-2ade-47ca-a61f-6cb445974084" (UID: "0ea464d5-2ade-47ca-a61f-6cb445974084"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.053723 4724 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.053765 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.053780 4724 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ea464d5-2ade-47ca-a61f-6cb445974084-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.053792 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hsfm\" (UniqueName: \"kubernetes.io/projected/0ea464d5-2ade-47ca-a61f-6cb445974084-kube-api-access-8hsfm\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.644772 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m27n9" event={"ID":"0ea464d5-2ade-47ca-a61f-6cb445974084","Type":"ContainerDied","Data":"05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43"} Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.644824 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05dfd31c6b317a5c0fe6f118ff860648776cbe8af25b43dc5ee4746d36eb0a43" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.644863 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m27n9" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.955387 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:02 crc kubenswrapper[4724]: E1002 13:19:02.956252 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea464d5-2ade-47ca-a61f-6cb445974084" containerName="glance-db-sync" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.956269 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea464d5-2ade-47ca-a61f-6cb445974084" containerName="glance-db-sync" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.956455 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea464d5-2ade-47ca-a61f-6cb445974084" containerName="glance-db-sync" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.957561 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.962628 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.964185 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.964343 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.964676 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-5t7ct" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.964684 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.970386 
4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.974633 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:02 crc kubenswrapper[4724]: I1002 13:19:02.991831 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:02 crc kubenswrapper[4724]: E1002 13:19:02.992387 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-9gzkn logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-9gzkn logs public-tls-certs scripts]: context canceled" pod="glance-kuttl-tests/glance-default-single-0" podUID="dc0f2817-6e3a-4c08-b667-056ce15d89bb" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068210 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068259 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gzkn\" (UniqueName: \"kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068291 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068312 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068483 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068662 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068723 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068780 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.068818 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170647 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170700 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gzkn\" (UniqueName: \"kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170733 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170757 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170785 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170827 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170851 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170875 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.170895 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" 
(UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.171283 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.172206 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.172918 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.175793 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.176998 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 
crc kubenswrapper[4724]: I1002 13:19:03.178055 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.184023 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.184805 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.189961 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gzkn\" (UniqueName: \"kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.198930 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.651319 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.662694 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679030 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679111 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679153 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679177 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679204 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc 
kubenswrapper[4724]: I1002 13:19:03.679250 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679285 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gzkn\" (UniqueName: \"kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679443 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.679483 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs\") pod \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\" (UID: \"dc0f2817-6e3a-4c08-b667-056ce15d89bb\") " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.680001 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.680241 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs" (OuterVolumeSpecName: "logs") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.680491 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.680514 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc0f2817-6e3a-4c08-b667-056ce15d89bb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.682291 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.682879 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.683119 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.683732 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.684010 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts" (OuterVolumeSpecName: "scripts") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.684371 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn" (OuterVolumeSpecName: "kube-api-access-9gzkn") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "kube-api-access-9gzkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.691844 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data" (OuterVolumeSpecName: "config-data") pod "dc0f2817-6e3a-4c08-b667-056ce15d89bb" (UID: "dc0f2817-6e3a-4c08-b667-056ce15d89bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781887 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781926 4724 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781937 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781947 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781956 4724 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781964 4724 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc0f2817-6e3a-4c08-b667-056ce15d89bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.781973 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gzkn\" (UniqueName: \"kubernetes.io/projected/dc0f2817-6e3a-4c08-b667-056ce15d89bb-kube-api-access-9gzkn\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.794691 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 13:19:03 crc kubenswrapper[4724]: I1002 13:19:03.884431 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.658813 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.699072 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.714396 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.734479 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.734553 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.734607 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.735312 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:19:04 crc kubenswrapper[4724]: I1002 13:19:04.735364 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8" gracePeriod=600 Oct 02 13:19:05 crc kubenswrapper[4724]: I1002 13:19:05.668992 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8" exitCode=0 Oct 02 13:19:05 crc kubenswrapper[4724]: I1002 13:19:05.669067 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8"} Oct 02 13:19:05 crc kubenswrapper[4724]: I1002 13:19:05.669300 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" 
event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992"} Oct 02 13:19:05 crc kubenswrapper[4724]: I1002 13:19:05.669327 4724 scope.go:117] "RemoveContainer" containerID="0a68cb5a6d61b6854f57fe6390e5dda2f41ea0bca0a949b5592be96925084795" Oct 02 13:19:06 crc kubenswrapper[4724]: I1002 13:19:06.324045 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0f2817-6e3a-4c08-b667-056ce15d89bb" path="/var/lib/kubelet/pods/dc0f2817-6e3a-4c08-b667-056ce15d89bb/volumes" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.603847 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.607087 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.610197 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.610378 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.610421 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.610660 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-5t7ct" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.612972 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.614459 4724 reflector.go:368] Caches populated for *v1.Secret from 
object-"glance-kuttl-tests"/"glance-default-single-config-data" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.628985 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670056 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670135 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670229 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670308 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670366 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6s2pd\" (UniqueName: \"kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670413 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670452 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670493 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.670528 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.773908 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774006 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774047 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2pd\" (UniqueName: \"kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774085 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774150 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774193 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774227 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774323 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.774353 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.775287 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.775984 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.776611 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.784058 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.785749 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.786493 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.787480 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: 
I1002 13:19:09.788992 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.795844 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2pd\" (UniqueName: \"kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.807062 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-0\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:09 crc kubenswrapper[4724]: I1002 13:19:09.982909 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:10 crc kubenswrapper[4724]: I1002 13:19:10.417789 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:10 crc kubenswrapper[4724]: I1002 13:19:10.711576 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerStarted","Data":"1d429099a14afd4c6f90fedd3a4a4f622c242857c1bf48ad752d839079746ba1"} Oct 02 13:19:11 crc kubenswrapper[4724]: I1002 13:19:11.731688 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerStarted","Data":"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb"} Oct 02 13:19:11 crc kubenswrapper[4724]: I1002 13:19:11.732127 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerStarted","Data":"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba"} Oct 02 13:19:19 crc kubenswrapper[4724]: I1002 13:19:19.983709 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:19 crc kubenswrapper[4724]: I1002 13:19:19.984911 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:20 crc kubenswrapper[4724]: I1002 13:19:20.022752 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:20 crc kubenswrapper[4724]: I1002 13:19:20.033714 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:20 crc kubenswrapper[4724]: I1002 
13:19:20.053782 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=11.053507675 podStartE2EDuration="11.053507675s" podCreationTimestamp="2025-10-02 13:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:19:11.761939405 +0000 UTC m=+1216.216698526" watchObservedRunningTime="2025-10-02 13:19:20.053507675 +0000 UTC m=+1224.508266806" Oct 02 13:19:20 crc kubenswrapper[4724]: I1002 13:19:20.802759 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:20 crc kubenswrapper[4724]: I1002 13:19:20.803217 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:22 crc kubenswrapper[4724]: I1002 13:19:22.917351 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:22 crc kubenswrapper[4724]: I1002 13:19:22.918081 4724 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 13:19:22 crc kubenswrapper[4724]: I1002 13:19:22.921380 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.860861 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m27n9"] Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.867808 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m27n9"] Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.902283 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3a8a-account-delete-dbzm7"] Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.903849 4724 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:23 crc kubenswrapper[4724]: E1002 13:19:23.936027 4724 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Oct 02 13:19:23 crc kubenswrapper[4724]: E1002 13:19:23.936100 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts podName:aecd86fe-b69b-4be6-945c-893da8bd9ca7 nodeName:}" failed. No retries permitted until 2025-10-02 13:19:24.436079555 +0000 UTC m=+1228.890838676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts") pod "glance-default-single-0" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7") : secret "glance-scripts" not found Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.963100 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3a8a-account-delete-dbzm7"] Oct 02 13:19:23 crc kubenswrapper[4724]: I1002 13:19:23.983044 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.036443 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpjh\" (UniqueName: \"kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh\") pod \"glance3a8a-account-delete-dbzm7\" (UID: \"940298e8-728b-4860-8680-1c0de9b5cdb4\") " pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.138043 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpjh\" (UniqueName: \"kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh\") pod \"glance3a8a-account-delete-dbzm7\" (UID: 
\"940298e8-728b-4860-8680-1c0de9b5cdb4\") " pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.161049 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpjh\" (UniqueName: \"kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh\") pod \"glance3a8a-account-delete-dbzm7\" (UID: \"940298e8-728b-4860-8680-1c0de9b5cdb4\") " pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.277010 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.323750 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea464d5-2ade-47ca-a61f-6cb445974084" path="/var/lib/kubelet/pods/0ea464d5-2ade-47ca-a61f-6cb445974084/volumes" Oct 02 13:19:24 crc kubenswrapper[4724]: E1002 13:19:24.442723 4724 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Oct 02 13:19:24 crc kubenswrapper[4724]: E1002 13:19:24.443068 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts podName:aecd86fe-b69b-4be6-945c-893da8bd9ca7 nodeName:}" failed. No retries permitted until 2025-10-02 13:19:25.443049898 +0000 UTC m=+1229.897809029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts") pod "glance-default-single-0" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7") : secret "glance-scripts" not found Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.786377 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3a8a-account-delete-dbzm7"] Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.837112 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" event={"ID":"940298e8-728b-4860-8680-1c0de9b5cdb4","Type":"ContainerStarted","Data":"0014b9ab98100c6cf07be720bf3e7ebb8edde9906a5b815accfd4a6727c89ab0"} Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.837714 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-log" containerID="cri-o://0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba" gracePeriod=30 Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.837786 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-httpd" containerID="cri-o://32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb" gracePeriod=30 Oct 02 13:19:24 crc kubenswrapper[4724]: I1002 13:19:24.852341 4724 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.112:9292/healthcheck\": EOF" Oct 02 13:19:25 crc kubenswrapper[4724]: E1002 13:19:25.459213 4724 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Oct 02 13:19:25 crc 
kubenswrapper[4724]: E1002 13:19:25.459707 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts podName:aecd86fe-b69b-4be6-945c-893da8bd9ca7 nodeName:}" failed. No retries permitted until 2025-10-02 13:19:27.459683942 +0000 UTC m=+1231.914443063 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts") pod "glance-default-single-0" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7") : secret "glance-scripts" not found Oct 02 13:19:25 crc kubenswrapper[4724]: I1002 13:19:25.846775 4724 generic.go:334] "Generic (PLEG): container finished" podID="940298e8-728b-4860-8680-1c0de9b5cdb4" containerID="ab7229977a329097a0842af112f62d676f4c303e70b2dc7b43f9396323489818" exitCode=0 Oct 02 13:19:25 crc kubenswrapper[4724]: I1002 13:19:25.846890 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" event={"ID":"940298e8-728b-4860-8680-1c0de9b5cdb4","Type":"ContainerDied","Data":"ab7229977a329097a0842af112f62d676f4c303e70b2dc7b43f9396323489818"} Oct 02 13:19:25 crc kubenswrapper[4724]: I1002 13:19:25.848609 4724 generic.go:334] "Generic (PLEG): container finished" podID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerID="0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba" exitCode=143 Oct 02 13:19:25 crc kubenswrapper[4724]: I1002 13:19:25.848724 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerDied","Data":"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba"} Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.170471 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.319454 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpjh\" (UniqueName: \"kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh\") pod \"940298e8-728b-4860-8680-1c0de9b5cdb4\" (UID: \"940298e8-728b-4860-8680-1c0de9b5cdb4\") " Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.327138 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh" (OuterVolumeSpecName: "kube-api-access-ndpjh") pod "940298e8-728b-4860-8680-1c0de9b5cdb4" (UID: "940298e8-728b-4860-8680-1c0de9b5cdb4"). InnerVolumeSpecName "kube-api-access-ndpjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.422490 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndpjh\" (UniqueName: \"kubernetes.io/projected/940298e8-728b-4860-8680-1c0de9b5cdb4-kube-api-access-ndpjh\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:27 crc kubenswrapper[4724]: E1002 13:19:27.524605 4724 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Oct 02 13:19:27 crc kubenswrapper[4724]: E1002 13:19:27.524692 4724 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts podName:aecd86fe-b69b-4be6-945c-893da8bd9ca7 nodeName:}" failed. No retries permitted until 2025-10-02 13:19:31.524670676 +0000 UTC m=+1235.979429797 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts") pod "glance-default-single-0" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7") : secret "glance-scripts" not found Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.864629 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" event={"ID":"940298e8-728b-4860-8680-1c0de9b5cdb4","Type":"ContainerDied","Data":"0014b9ab98100c6cf07be720bf3e7ebb8edde9906a5b815accfd4a6727c89ab0"} Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.864678 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0014b9ab98100c6cf07be720bf3e7ebb8edde9906a5b815accfd4a6727c89ab0" Oct 02 13:19:27 crc kubenswrapper[4724]: I1002 13:19:27.864774 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3a8a-account-delete-dbzm7" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.667971 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749134 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s2pd\" (UniqueName: \"kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749200 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749296 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749344 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749362 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749394 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749423 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749438 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.749458 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run\") pod \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\" (UID: \"aecd86fe-b69b-4be6-945c-893da8bd9ca7\") " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.750252 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.751860 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs" (OuterVolumeSpecName: "logs") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.754833 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd" (OuterVolumeSpecName: "kube-api-access-6s2pd") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "kube-api-access-6s2pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.756563 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts" (OuterVolumeSpecName: "scripts") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.756674 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.774759 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.798830 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.807063 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data" (OuterVolumeSpecName: "config-data") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.809985 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aecd86fe-b69b-4be6-945c-893da8bd9ca7" (UID: "aecd86fe-b69b-4be6-945c-893da8bd9ca7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851198 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s2pd\" (UniqueName: \"kubernetes.io/projected/aecd86fe-b69b-4be6-945c-893da8bd9ca7-kube-api-access-6s2pd\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851232 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851272 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851285 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851297 4724 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851307 4724 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851317 4724 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851327 4724 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aecd86fe-b69b-4be6-945c-893da8bd9ca7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.851335 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aecd86fe-b69b-4be6-945c-893da8bd9ca7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.870762 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.872890 4724 generic.go:334] "Generic (PLEG): container finished" podID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerID="32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb" exitCode=0 Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.872935 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.872938 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerDied","Data":"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb"} Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.872973 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"aecd86fe-b69b-4be6-945c-893da8bd9ca7","Type":"ContainerDied","Data":"1d429099a14afd4c6f90fedd3a4a4f622c242857c1bf48ad752d839079746ba1"} Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.872989 4724 scope.go:117] "RemoveContainer" containerID="32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.892719 4724 scope.go:117] "RemoveContainer" containerID="0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.901603 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.907037 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.914077 4724 scope.go:117] "RemoveContainer" containerID="32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb" Oct 02 13:19:28 crc kubenswrapper[4724]: E1002 13:19:28.918716 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb\": container with ID starting with 32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb not found: ID does not exist" 
containerID="32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.918775 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb"} err="failed to get container status \"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb\": rpc error: code = NotFound desc = could not find container \"32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb\": container with ID starting with 32d57e51b7e1b52541565ec2ca8bb82072045bcbf16f98f5ca7efb348aa342fb not found: ID does not exist" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.918804 4724 scope.go:117] "RemoveContainer" containerID="0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba" Oct 02 13:19:28 crc kubenswrapper[4724]: E1002 13:19:28.919403 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba\": container with ID starting with 0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba not found: ID does not exist" containerID="0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.919426 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba"} err="failed to get container status \"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba\": rpc error: code = NotFound desc = could not find container \"0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba\": container with ID starting with 0a21b87856e326a7c8100cf1dfe1212e8b66ad850443fca96a3279cb3b4105ba not found: ID does not exist" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.935280 4724 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-q9h5p"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.941377 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-q9h5p"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.947050 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3a8a-account-create-kzm2c"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.951609 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3a8a-account-delete-dbzm7"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.952519 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.958037 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3a8a-account-delete-dbzm7"] Oct 02 13:19:28 crc kubenswrapper[4724]: I1002 13:19:28.962242 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3a8a-account-create-kzm2c"] Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.812703 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-jbs2p"] Oct 02 13:19:29 crc kubenswrapper[4724]: E1002 13:19:29.813090 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-httpd" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813110 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-httpd" Oct 02 13:19:29 crc kubenswrapper[4724]: E1002 13:19:29.813129 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-log" Oct 02 13:19:29 crc kubenswrapper[4724]: 
I1002 13:19:29.813137 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-log" Oct 02 13:19:29 crc kubenswrapper[4724]: E1002 13:19:29.813156 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940298e8-728b-4860-8680-1c0de9b5cdb4" containerName="mariadb-account-delete" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813164 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="940298e8-728b-4860-8680-1c0de9b5cdb4" containerName="mariadb-account-delete" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813315 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="940298e8-728b-4860-8680-1c0de9b5cdb4" containerName="mariadb-account-delete" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813330 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-log" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813346 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" containerName="glance-httpd" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.813972 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.818730 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbs2p"] Oct 02 13:19:29 crc kubenswrapper[4724]: I1002 13:19:29.969474 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf78p\" (UniqueName: \"kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p\") pod \"glance-db-create-jbs2p\" (UID: \"6aaa371f-e098-479e-a2f5-a90f58ab1e57\") " pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.071530 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf78p\" (UniqueName: \"kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p\") pod \"glance-db-create-jbs2p\" (UID: \"6aaa371f-e098-479e-a2f5-a90f58ab1e57\") " pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.095213 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf78p\" (UniqueName: \"kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p\") pod \"glance-db-create-jbs2p\" (UID: \"6aaa371f-e098-479e-a2f5-a90f58ab1e57\") " pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.184141 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.324286 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb35493-0f64-406c-9997-3c0a5a47c8bb" path="/var/lib/kubelet/pods/6fb35493-0f64-406c-9997-3c0a5a47c8bb/volumes" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.325261 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940298e8-728b-4860-8680-1c0de9b5cdb4" path="/var/lib/kubelet/pods/940298e8-728b-4860-8680-1c0de9b5cdb4/volumes" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.325949 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecd86fe-b69b-4be6-945c-893da8bd9ca7" path="/var/lib/kubelet/pods/aecd86fe-b69b-4be6-945c-893da8bd9ca7/volumes" Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.327493 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d769764e-3687-4760-94eb-3516b6dbbaa1" path="/var/lib/kubelet/pods/d769764e-3687-4760-94eb-3516b6dbbaa1/volumes" Oct 02 13:19:30 crc kubenswrapper[4724]: W1002 13:19:30.611118 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aaa371f_e098_479e_a2f5_a90f58ab1e57.slice/crio-b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15 WatchSource:0}: Error finding container b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15: Status 404 returned error can't find the container with id b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15 Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.611353 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbs2p"] Oct 02 13:19:30 crc kubenswrapper[4724]: I1002 13:19:30.892222 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbs2p" 
event={"ID":"6aaa371f-e098-479e-a2f5-a90f58ab1e57","Type":"ContainerStarted","Data":"b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15"} Oct 02 13:19:31 crc kubenswrapper[4724]: I1002 13:19:31.901247 4724 generic.go:334] "Generic (PLEG): container finished" podID="6aaa371f-e098-479e-a2f5-a90f58ab1e57" containerID="63b398025309fdd40adac91eb83dc0fccf1d1314ad7e354307fc0f2a7cce76ba" exitCode=0 Oct 02 13:19:31 crc kubenswrapper[4724]: I1002 13:19:31.901341 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbs2p" event={"ID":"6aaa371f-e098-479e-a2f5-a90f58ab1e57","Type":"ContainerDied","Data":"63b398025309fdd40adac91eb83dc0fccf1d1314ad7e354307fc0f2a7cce76ba"} Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.245136 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.327355 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf78p\" (UniqueName: \"kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p\") pod \"6aaa371f-e098-479e-a2f5-a90f58ab1e57\" (UID: \"6aaa371f-e098-479e-a2f5-a90f58ab1e57\") " Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.332783 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p" (OuterVolumeSpecName: "kube-api-access-zf78p") pod "6aaa371f-e098-479e-a2f5-a90f58ab1e57" (UID: "6aaa371f-e098-479e-a2f5-a90f58ab1e57"). InnerVolumeSpecName "kube-api-access-zf78p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.429646 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf78p\" (UniqueName: \"kubernetes.io/projected/6aaa371f-e098-479e-a2f5-a90f58ab1e57-kube-api-access-zf78p\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.915869 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbs2p" event={"ID":"6aaa371f-e098-479e-a2f5-a90f58ab1e57","Type":"ContainerDied","Data":"b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15"} Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.915917 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3783b89ae731b9ab392ed92b51ed767122dc5e4ad2674aae246bb4cc3973c15" Oct 02 13:19:33 crc kubenswrapper[4724]: I1002 13:19:33.915953 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbs2p" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.844357 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cee3-account-create-pmhd5"] Oct 02 13:19:39 crc kubenswrapper[4724]: E1002 13:19:39.845912 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aaa371f-e098-479e-a2f5-a90f58ab1e57" containerName="mariadb-database-create" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.845938 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aaa371f-e098-479e-a2f5-a90f58ab1e57" containerName="mariadb-database-create" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.846206 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aaa371f-e098-479e-a2f5-a90f58ab1e57" containerName="mariadb-database-create" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.847172 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.850006 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.856469 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cee3-account-create-pmhd5"] Oct 02 13:19:39 crc kubenswrapper[4724]: I1002 13:19:39.933112 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8249\" (UniqueName: \"kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249\") pod \"glance-cee3-account-create-pmhd5\" (UID: \"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea\") " pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:40 crc kubenswrapper[4724]: I1002 13:19:40.035096 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8249\" (UniqueName: \"kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249\") pod \"glance-cee3-account-create-pmhd5\" (UID: \"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea\") " pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:40 crc kubenswrapper[4724]: I1002 13:19:40.065943 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8249\" (UniqueName: \"kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249\") pod \"glance-cee3-account-create-pmhd5\" (UID: \"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea\") " pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:40 crc kubenswrapper[4724]: I1002 13:19:40.182490 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:40 crc kubenswrapper[4724]: I1002 13:19:40.594667 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cee3-account-create-pmhd5"] Oct 02 13:19:41 crc kubenswrapper[4724]: I1002 13:19:41.000179 4724 generic.go:334] "Generic (PLEG): container finished" podID="5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" containerID="33bb4711255d18e7877afd8ef4db93bcfc508253e09d25b0fd3e486b235bf183" exitCode=0 Oct 02 13:19:41 crc kubenswrapper[4724]: I1002 13:19:41.000284 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" event={"ID":"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea","Type":"ContainerDied","Data":"33bb4711255d18e7877afd8ef4db93bcfc508253e09d25b0fd3e486b235bf183"} Oct 02 13:19:41 crc kubenswrapper[4724]: I1002 13:19:41.000591 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" event={"ID":"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea","Type":"ContainerStarted","Data":"791bb96930098d540e4af75894cc3508f269f9db0cc1d8dbc0fa9ca21e143d86"} Oct 02 13:19:42 crc kubenswrapper[4724]: I1002 13:19:42.333786 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:42 crc kubenswrapper[4724]: I1002 13:19:42.478522 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8249\" (UniqueName: \"kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249\") pod \"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea\" (UID: \"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea\") " Oct 02 13:19:42 crc kubenswrapper[4724]: I1002 13:19:42.486780 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249" (OuterVolumeSpecName: "kube-api-access-z8249") pod "5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" (UID: "5d5bb294-8805-4ebb-9ccb-bfa05522a8ea"). InnerVolumeSpecName "kube-api-access-z8249". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:42 crc kubenswrapper[4724]: I1002 13:19:42.579974 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8249\" (UniqueName: \"kubernetes.io/projected/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea-kube-api-access-z8249\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:43 crc kubenswrapper[4724]: I1002 13:19:43.017458 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" event={"ID":"5d5bb294-8805-4ebb-9ccb-bfa05522a8ea","Type":"ContainerDied","Data":"791bb96930098d540e4af75894cc3508f269f9db0cc1d8dbc0fa9ca21e143d86"} Oct 02 13:19:43 crc kubenswrapper[4724]: I1002 13:19:43.017498 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791bb96930098d540e4af75894cc3508f269f9db0cc1d8dbc0fa9ca21e143d86" Oct 02 13:19:43 crc kubenswrapper[4724]: I1002 13:19:43.017601 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cee3-account-create-pmhd5" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.913555 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-zsx66"] Oct 02 13:19:44 crc kubenswrapper[4724]: E1002 13:19:44.914379 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" containerName="mariadb-account-create" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.914400 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" containerName="mariadb-account-create" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.914583 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" containerName="mariadb-account-create" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.915226 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.918203 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.921484 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-t2dm8" Oct 02 13:19:44 crc kubenswrapper[4724]: I1002 13:19:44.951879 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zsx66"] Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.020462 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc 
kubenswrapper[4724]: I1002 13:19:45.020571 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzh57\" (UniqueName: \"kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.020606 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.122160 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.122574 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzh57\" (UniqueName: \"kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.122676 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 
13:19:45.127374 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.136253 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.155163 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzh57\" (UniqueName: \"kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57\") pod \"glance-db-sync-zsx66\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.236562 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:45 crc kubenswrapper[4724]: I1002 13:19:45.652401 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zsx66"] Oct 02 13:19:46 crc kubenswrapper[4724]: I1002 13:19:46.042699 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zsx66" event={"ID":"29911b4c-644e-4928-b3d4-be90c7009131","Type":"ContainerStarted","Data":"78fdb1bc063574875659801a35be1cdee0e023f0660b926a907336748a0296b8"} Oct 02 13:19:47 crc kubenswrapper[4724]: I1002 13:19:47.054413 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zsx66" event={"ID":"29911b4c-644e-4928-b3d4-be90c7009131","Type":"ContainerStarted","Data":"cb75b5491f47c29561dac07adb19990f1f5d516da1a88e9854f672df3422990d"} Oct 02 13:19:47 crc kubenswrapper[4724]: I1002 13:19:47.070727 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-zsx66" podStartSLOduration=3.070698764 podStartE2EDuration="3.070698764s" podCreationTimestamp="2025-10-02 13:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:19:47.069266387 +0000 UTC m=+1251.524025508" watchObservedRunningTime="2025-10-02 13:19:47.070698764 +0000 UTC m=+1251.525457915" Oct 02 13:19:50 crc kubenswrapper[4724]: I1002 13:19:50.081260 4724 generic.go:334] "Generic (PLEG): container finished" podID="29911b4c-644e-4928-b3d4-be90c7009131" containerID="cb75b5491f47c29561dac07adb19990f1f5d516da1a88e9854f672df3422990d" exitCode=0 Oct 02 13:19:50 crc kubenswrapper[4724]: I1002 13:19:50.081345 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zsx66" 
event={"ID":"29911b4c-644e-4928-b3d4-be90c7009131","Type":"ContainerDied","Data":"cb75b5491f47c29561dac07adb19990f1f5d516da1a88e9854f672df3422990d"} Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.388687 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.525910 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data\") pod \"29911b4c-644e-4928-b3d4-be90c7009131\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.525978 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzh57\" (UniqueName: \"kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57\") pod \"29911b4c-644e-4928-b3d4-be90c7009131\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.526069 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data\") pod \"29911b4c-644e-4928-b3d4-be90c7009131\" (UID: \"29911b4c-644e-4928-b3d4-be90c7009131\") " Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.532244 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57" (OuterVolumeSpecName: "kube-api-access-lzh57") pod "29911b4c-644e-4928-b3d4-be90c7009131" (UID: "29911b4c-644e-4928-b3d4-be90c7009131"). InnerVolumeSpecName "kube-api-access-lzh57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.532641 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29911b4c-644e-4928-b3d4-be90c7009131" (UID: "29911b4c-644e-4928-b3d4-be90c7009131"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.570145 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data" (OuterVolumeSpecName: "config-data") pod "29911b4c-644e-4928-b3d4-be90c7009131" (UID: "29911b4c-644e-4928-b3d4-be90c7009131"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.627780 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.627859 4724 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29911b4c-644e-4928-b3d4-be90c7009131-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:51 crc kubenswrapper[4724]: I1002 13:19:51.627872 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzh57\" (UniqueName: \"kubernetes.io/projected/29911b4c-644e-4928-b3d4-be90c7009131-kube-api-access-lzh57\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:52 crc kubenswrapper[4724]: I1002 13:19:52.100825 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zsx66" 
event={"ID":"29911b4c-644e-4928-b3d4-be90c7009131","Type":"ContainerDied","Data":"78fdb1bc063574875659801a35be1cdee0e023f0660b926a907336748a0296b8"} Oct 02 13:19:52 crc kubenswrapper[4724]: I1002 13:19:52.100879 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78fdb1bc063574875659801a35be1cdee0e023f0660b926a907336748a0296b8" Oct 02 13:19:52 crc kubenswrapper[4724]: I1002 13:19:52.100896 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zsx66" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.432060 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:53 crc kubenswrapper[4724]: E1002 13:19:53.432872 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29911b4c-644e-4928-b3d4-be90c7009131" containerName="glance-db-sync" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.432893 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="29911b4c-644e-4928-b3d4-be90c7009131" containerName="glance-db-sync" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.433109 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="29911b4c-644e-4928-b3d4-be90c7009131" containerName="glance-db-sync" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.434510 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.438600 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-t2dm8" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.438619 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.443618 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.447919 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:53 crc kubenswrapper[4724]: E1002 13:19:53.454287 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-8fl44 lib-modules logs run scripts sys var-locks-brick], unattached volumes=[], failed to process volumes=[config-data dev etc-iscsi etc-nvme glance glance-cache httpd-run kube-api-access-8fl44 lib-modules logs run scripts sys var-locks-brick]: context canceled" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="43274654-831e-4791-bf9f-1302c6168bbd" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.514085 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.537788 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.540595 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.543710 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.548577 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.562679 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.562929 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563008 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563075 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563146 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563222 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563299 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563388 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563460 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fl44\" (UniqueName: \"kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44\") pod \"glance-default-external-api-0\" 
(UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563563 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563661 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563745 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563817 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.563879 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665710 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fl44\" (UniqueName: \"kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665784 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665819 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665873 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665919 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys\") pod 
\"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665947 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.665973 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666001 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666032 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666065 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme\") pod 
\"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666092 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666119 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666157 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666183 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666212 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs\") pod \"glance-default-external-api-0\" 
(UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666242 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666270 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666306 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666332 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666365 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666397 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666429 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666463 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666486 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666510 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666563 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftdn\" (UniqueName: \"kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666606 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.666638 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667351 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667374 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667655 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667717 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667884 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667897 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.667921 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc 
kubenswrapper[4724]: I1002 13:19:53.667936 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.668252 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.668773 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.671945 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.677486 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.677864 4724 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.693406 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fl44\" (UniqueName: \"kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.693649 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.708979 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768286 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768370 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768425 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768461 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768493 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768529 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768578 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768607 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftdn\" (UniqueName: \"kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768653 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768699 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768731 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768752 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768776 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768808 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.768976 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.769908 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.769996 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme\") 
pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770038 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770082 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770149 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770177 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770229 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") device mount 
path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770409 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.770852 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.771094 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.775229 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.783362 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 
13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.793459 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.794070 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.794512 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftdn\" (UniqueName: \"kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn\") pod \"glance-default-internal-api-0\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:53 crc kubenswrapper[4724]: I1002 13:19:53.863001 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.114404 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.124946 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175415 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175476 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175580 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175662 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175688 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175715 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175747 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175753 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175819 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run" (OuterVolumeSpecName: "run") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175861 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175805 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.175982 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fl44\" (UniqueName: \"kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176055 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176085 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176107 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176155 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176175 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"43274654-831e-4791-bf9f-1302c6168bbd\" (UID: \"43274654-831e-4791-bf9f-1302c6168bbd\") " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176807 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176828 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176840 4724 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176909 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176955 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev" (OuterVolumeSpecName: "dev") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). 
InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.176977 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.177670 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.177785 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys" (OuterVolumeSpecName: "sys") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.178120 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs" (OuterVolumeSpecName: "logs") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.179669 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.179676 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts" (OuterVolumeSpecName: "scripts") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.180915 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44" (OuterVolumeSpecName: "kube-api-access-8fl44") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "kube-api-access-8fl44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.181372 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data" (OuterVolumeSpecName: "config-data") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.182337 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "43274654-831e-4791-bf9f-1302c6168bbd" (UID: "43274654-831e-4791-bf9f-1302c6168bbd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279034 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279075 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279088 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279101 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fl44\" (UniqueName: \"kubernetes.io/projected/43274654-831e-4791-bf9f-1302c6168bbd-kube-api-access-8fl44\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279144 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279158 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-etc-iscsi\") on node \"crc\" 
DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279170 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279185 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43274654-831e-4791-bf9f-1302c6168bbd-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279204 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279217 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43274654-831e-4791-bf9f-1302c6168bbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.279229 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43274654-831e-4791-bf9f-1302c6168bbd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.292567 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.296116 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.329638 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.381342 
4724 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:54 crc kubenswrapper[4724]: I1002 13:19:54.381386 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.106522 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.125144 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.125704 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerStarted","Data":"df87ec7038b57339cccf71861f65d3bccd3a35ef0cc4adebae7bb7efdc45a8b7"} Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.125774 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerStarted","Data":"b8655e8796a15a673495876bfb199b5496812ea51a2a6dcbd4932acaf7134c61"} Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.230973 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.250058 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.278985 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:55 crc 
kubenswrapper[4724]: I1002 13:19:55.281706 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.284511 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.298691 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.398620 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399070 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399147 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399193 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-scripts\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399219 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399251 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54tx\" (UniqueName: \"kubernetes.io/projected/240e1579-5ea8-45fe-97e7-707ca8f6622f-kube-api-access-l54tx\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399283 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399313 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399343 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399374 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-dev\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399412 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-sys\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399455 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399481 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.399504 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501423 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501783 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501813 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501841 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501910 4724 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501954 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.501980 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502002 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502036 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502071 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-scripts\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502097 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502123 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54tx\" (UniqueName: \"kubernetes.io/projected/240e1579-5ea8-45fe-97e7-707ca8f6622f-kube-api-access-l54tx\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502147 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502180 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502205 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502230 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-dev\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502257 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-sys\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502341 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-sys\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502408 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502609 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: 
\"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502673 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502708 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-dev\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.502774 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/240e1579-5ea8-45fe-97e7-707ca8f6622f-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.503075 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.503140 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e1579-5ea8-45fe-97e7-707ca8f6622f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.509368 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-scripts\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.516980 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e1579-5ea8-45fe-97e7-707ca8f6622f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.535509 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54tx\" (UniqueName: \"kubernetes.io/projected/240e1579-5ea8-45fe-97e7-707ca8f6622f-kube-api-access-l54tx\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.545040 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: I1002 13:19:55.546747 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e1579-5ea8-45fe-97e7-707ca8f6622f\") " pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:55 crc kubenswrapper[4724]: 
I1002 13:19:55.599280 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.065083 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.145857 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerStarted","Data":"8e255ab7a76fed9fb2f05c6b8a760082e9f7273ee269d0926cf5360ea7dcc488"} Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.145932 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerStarted","Data":"a91b74756fcdef65ab39054c7f4f9d4707bac3c575bfd4e2e66ec2b138333575"} Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.146112 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-log" containerID="cri-o://df87ec7038b57339cccf71861f65d3bccd3a35ef0cc4adebae7bb7efdc45a8b7" gracePeriod=30 Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.146845 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-api" containerID="cri-o://8e255ab7a76fed9fb2f05c6b8a760082e9f7273ee269d0926cf5360ea7dcc488" gracePeriod=30 Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.146897 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-httpd" 
containerID="cri-o://a91b74756fcdef65ab39054c7f4f9d4707bac3c575bfd4e2e66ec2b138333575" gracePeriod=30 Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.149772 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"240e1579-5ea8-45fe-97e7-707ca8f6622f","Type":"ContainerStarted","Data":"152191f7b4caa049f9b7ec4e117af6f68cc7293a161558b78734c1863f3c2512"} Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.181169 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.181145053 podStartE2EDuration="4.181145053s" podCreationTimestamp="2025-10-02 13:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:19:56.179382577 +0000 UTC m=+1260.634141688" watchObservedRunningTime="2025-10-02 13:19:56.181145053 +0000 UTC m=+1260.635904174" Oct 02 13:19:56 crc kubenswrapper[4724]: I1002 13:19:56.324504 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43274654-831e-4791-bf9f-1302c6168bbd" path="/var/lib/kubelet/pods/43274654-831e-4791-bf9f-1302c6168bbd/volumes" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.158595 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"240e1579-5ea8-45fe-97e7-707ca8f6622f","Type":"ContainerStarted","Data":"51b22269e6c7837234759ef807b8512d80064175a0cfc99c65f3e8c8fb96b755"} Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162277 4724 generic.go:334] "Generic (PLEG): container finished" podID="668dadb4-f580-49dc-b725-15366cf424f3" containerID="8e255ab7a76fed9fb2f05c6b8a760082e9f7273ee269d0926cf5360ea7dcc488" exitCode=143 Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162312 4724 generic.go:334] "Generic (PLEG): container finished" 
podID="668dadb4-f580-49dc-b725-15366cf424f3" containerID="a91b74756fcdef65ab39054c7f4f9d4707bac3c575bfd4e2e66ec2b138333575" exitCode=143 Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162321 4724 generic.go:334] "Generic (PLEG): container finished" podID="668dadb4-f580-49dc-b725-15366cf424f3" containerID="df87ec7038b57339cccf71861f65d3bccd3a35ef0cc4adebae7bb7efdc45a8b7" exitCode=143 Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162332 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerDied","Data":"8e255ab7a76fed9fb2f05c6b8a760082e9f7273ee269d0926cf5360ea7dcc488"} Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162358 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerDied","Data":"a91b74756fcdef65ab39054c7f4f9d4707bac3c575bfd4e2e66ec2b138333575"} Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.162369 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerDied","Data":"df87ec7038b57339cccf71861f65d3bccd3a35ef0cc4adebae7bb7efdc45a8b7"} Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.808259 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849226 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849281 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849310 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849373 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849437 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849490 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849551 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849620 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rftdn\" (UniqueName: \"kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849653 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849707 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849740 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849778 4724 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849798 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.849832 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules\") pod \"668dadb4-f580-49dc-b725-15366cf424f3\" (UID: \"668dadb4-f580-49dc-b725-15366cf424f3\") " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.850219 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.850600 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.850906 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys" (OuterVolumeSpecName: "sys") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.851336 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run" (OuterVolumeSpecName: "run") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.851415 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.851446 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.851475 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.854066 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs" (OuterVolumeSpecName: "logs") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.854162 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev" (OuterVolumeSpecName: "dev") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.856710 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn" (OuterVolumeSpecName: "kube-api-access-rftdn") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "kube-api-access-rftdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.860048 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.860230 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts" (OuterVolumeSpecName: "scripts") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.861727 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.950927 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rftdn\" (UniqueName: \"kubernetes.io/projected/668dadb4-f580-49dc-b725-15366cf424f3-kube-api-access-rftdn\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.950972 4724 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.950985 4724 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.950998 4724 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951010 4724 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-dev\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951041 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951052 4724 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951062 4724 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/668dadb4-f580-49dc-b725-15366cf424f3-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951073 4724 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-sys\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951088 4724 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951096 4724 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951105 4724 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.951112 4724 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/668dadb4-f580-49dc-b725-15366cf424f3-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.952418 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data" (OuterVolumeSpecName: "config-data") pod "668dadb4-f580-49dc-b725-15366cf424f3" (UID: "668dadb4-f580-49dc-b725-15366cf424f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.964833 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 13:19:57 crc kubenswrapper[4724]: I1002 13:19:57.964901 4724 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.052822 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.052860 4724 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/668dadb4-f580-49dc-b725-15366cf424f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.052876 4724 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.172757 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"240e1579-5ea8-45fe-97e7-707ca8f6622f","Type":"ContainerStarted","Data":"421f939dbac8a080f35aa96c7af8f9b4b2c0b06931cd0905121ff2675e4cefde"} Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.173310 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"240e1579-5ea8-45fe-97e7-707ca8f6622f","Type":"ContainerStarted","Data":"b3cffbb350c34ce00f6d928b1e96f3ccb2f186d37ed6ae6de9ebeb76b7ca8827"} Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.175597 4724 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"668dadb4-f580-49dc-b725-15366cf424f3","Type":"ContainerDied","Data":"b8655e8796a15a673495876bfb199b5496812ea51a2a6dcbd4932acaf7134c61"} Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.175677 4724 scope.go:117] "RemoveContainer" containerID="8e255ab7a76fed9fb2f05c6b8a760082e9f7273ee269d0926cf5360ea7dcc488" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.175695 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.215774 4724 scope.go:117] "RemoveContainer" containerID="a91b74756fcdef65ab39054c7f4f9d4707bac3c575bfd4e2e66ec2b138333575" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.247995 4724 scope.go:117] "RemoveContainer" containerID="df87ec7038b57339cccf71861f65d3bccd3a35ef0cc4adebae7bb7efdc45a8b7" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.248034 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.265713 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.272754 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:58 crc kubenswrapper[4724]: E1002 13:19:58.275875 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-api" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.275925 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-api" Oct 02 13:19:58 crc kubenswrapper[4724]: E1002 13:19:58.276007 4724 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-log" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.276014 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-log" Oct 02 13:19:58 crc kubenswrapper[4724]: E1002 13:19:58.276035 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-httpd" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.276046 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-httpd" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.276290 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-httpd" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.276321 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-log" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.276332 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="668dadb4-f580-49dc-b725-15366cf424f3" containerName="glance-api" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.277609 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.279634 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.279672 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.324901 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668dadb4-f580-49dc-b725-15366cf424f3" path="/var/lib/kubelet/pods/668dadb4-f580-49dc-b725-15366cf424f3/volumes" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.459926 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460057 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460075 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460098 4724 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460122 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460853 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbln\" (UniqueName: \"kubernetes.io/projected/7d8fb847-abad-4b54-91db-7da2ad47cfb2-kube-api-access-hpbln\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460884 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460942 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc 
kubenswrapper[4724]: I1002 13:19:58.460975 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.460995 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.461035 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.461102 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.461196 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.461257 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562655 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562728 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562776 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562799 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc 
kubenswrapper[4724]: I1002 13:19:58.562832 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562859 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562890 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbln\" (UniqueName: \"kubernetes.io/projected/7d8fb847-abad-4b54-91db-7da2ad47cfb2-kube-api-access-hpbln\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562921 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562945 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 
13:19:58.562971 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.562993 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.563025 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.563056 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.563083 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.563674 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.563930 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569027 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-dev\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569137 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569167 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569192 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569198 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569223 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-sys\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569386 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7d8fb847-abad-4b54-91db-7da2ad47cfb2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569510 4724 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.569674 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7d8fb847-abad-4b54-91db-7da2ad47cfb2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.576569 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.581273 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8fb847-abad-4b54-91db-7da2ad47cfb2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.595249 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbln\" (UniqueName: \"kubernetes.io/projected/7d8fb847-abad-4b54-91db-7da2ad47cfb2-kube-api-access-hpbln\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.597660 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.606470 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"7d8fb847-abad-4b54-91db-7da2ad47cfb2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:58 crc kubenswrapper[4724]: I1002 13:19:58.895670 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:19:59 crc kubenswrapper[4724]: I1002 13:19:59.218296 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.218276033 podStartE2EDuration="4.218276033s" podCreationTimestamp="2025-10-02 13:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:19:59.20544924 +0000 UTC m=+1263.660208361" watchObservedRunningTime="2025-10-02 13:19:59.218276033 +0000 UTC m=+1263.673035154" Oct 02 13:19:59 crc kubenswrapper[4724]: I1002 13:19:59.385962 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Oct 02 13:20:00 crc kubenswrapper[4724]: I1002 13:20:00.196968 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7d8fb847-abad-4b54-91db-7da2ad47cfb2","Type":"ContainerStarted","Data":"51f273c9096461a9ed6e0d66692faa19544900d55284566f7bb5243dc6336a10"} Oct 02 13:20:01 crc kubenswrapper[4724]: I1002 13:20:01.211186 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7d8fb847-abad-4b54-91db-7da2ad47cfb2","Type":"ContainerStarted","Data":"eab2c993f9bec489d9afc3fbf4cb298e7a50020d63ecde79e9fc177c1ba7446e"} Oct 02 13:20:01 crc kubenswrapper[4724]: I1002 13:20:01.213702 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"7d8fb847-abad-4b54-91db-7da2ad47cfb2","Type":"ContainerStarted","Data":"b02c829820043157c198f4cc84809aa32223afe9edc2460890eeab8b066bb4cd"} Oct 02 13:20:01 crc kubenswrapper[4724]: I1002 13:20:01.213775 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"7d8fb847-abad-4b54-91db-7da2ad47cfb2","Type":"ContainerStarted","Data":"034e2b0b4b27b0b0376d1a9fef56d8c73d318d6ab3cef178e9af28768ba760d7"} Oct 02 13:20:02 crc kubenswrapper[4724]: I1002 13:20:02.266963 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.266932524 podStartE2EDuration="4.266932524s" podCreationTimestamp="2025-10-02 13:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:20:02.263180016 +0000 UTC m=+1266.717939177" watchObservedRunningTime="2025-10-02 13:20:02.266932524 +0000 UTC m=+1266.721691685" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.600015 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.600698 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.602286 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.624815 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.625203 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:05 crc kubenswrapper[4724]: I1002 13:20:05.642419 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.260592 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.260668 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.260686 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.278284 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.279033 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:06 crc kubenswrapper[4724]: I1002 13:20:06.288418 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Oct 02 13:20:08 crc kubenswrapper[4724]: I1002 13:20:08.896197 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:08 crc kubenswrapper[4724]: I1002 13:20:08.896700 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:08 crc kubenswrapper[4724]: I1002 13:20:08.896740 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:08 crc 
kubenswrapper[4724]: I1002 13:20:08.921391 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:08 crc kubenswrapper[4724]: I1002 13:20:08.921510 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:08 crc kubenswrapper[4724]: I1002 13:20:08.939752 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.287274 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.287339 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.287350 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.306695 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.317483 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:20:09 crc kubenswrapper[4724]: I1002 13:20:09.317966 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Oct 02 13:21:34 crc kubenswrapper[4724]: I1002 13:21:34.734400 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:21:34 crc kubenswrapper[4724]: I1002 13:21:34.735096 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:04 crc kubenswrapper[4724]: I1002 13:22:04.734726 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:22:04 crc kubenswrapper[4724]: I1002 13:22:04.735255 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:34 crc kubenswrapper[4724]: I1002 13:22:34.734795 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:22:34 crc kubenswrapper[4724]: I1002 13:22:34.735480 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:34 crc kubenswrapper[4724]: I1002 
13:22:34.735604 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:22:34 crc kubenswrapper[4724]: I1002 13:22:34.736676 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:22:34 crc kubenswrapper[4724]: I1002 13:22:34.736823 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992" gracePeriod=600 Oct 02 13:22:35 crc kubenswrapper[4724]: I1002 13:22:35.495641 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992" exitCode=0 Oct 02 13:22:35 crc kubenswrapper[4724]: I1002 13:22:35.495753 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992"} Oct 02 13:22:35 crc kubenswrapper[4724]: I1002 13:22:35.496200 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a"} Oct 02 13:22:35 crc kubenswrapper[4724]: I1002 13:22:35.496269 4724 scope.go:117] 
"RemoveContainer" containerID="d6edbaca1be551c79f462bf303a060c3a2f4d99fd2847faa868ec902caa0b3e8" Oct 02 13:22:46 crc kubenswrapper[4724]: I1002 13:22:46.940449 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:22:46 crc kubenswrapper[4724]: I1002 13:22:46.942597 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:46 crc kubenswrapper[4724]: I1002 13:22:46.951900 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.064766 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc77\" (UniqueName: \"kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.064829 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.064859 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.166930 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lfc77\" (UniqueName: \"kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.167019 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.167051 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.167562 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.167656 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.193061 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lfc77\" (UniqueName: \"kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77\") pod \"community-operators-b64kh\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.269949 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:47 crc kubenswrapper[4724]: I1002 13:22:47.731805 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:22:48 crc kubenswrapper[4724]: I1002 13:22:48.604462 4724 generic.go:334] "Generic (PLEG): container finished" podID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerID="5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5" exitCode=0 Oct 02 13:22:48 crc kubenswrapper[4724]: I1002 13:22:48.604573 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerDied","Data":"5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5"} Oct 02 13:22:48 crc kubenswrapper[4724]: I1002 13:22:48.604849 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerStarted","Data":"25f63afa9987e18a996fcd321d555c90aa2e1dfaf6ad999a7b4dd82ab3f0a8c2"} Oct 02 13:22:48 crc kubenswrapper[4724]: I1002 13:22:48.606150 4724 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:22:49 crc kubenswrapper[4724]: I1002 13:22:49.614558 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" 
event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerStarted","Data":"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0"} Oct 02 13:22:50 crc kubenswrapper[4724]: I1002 13:22:50.628831 4724 generic.go:334] "Generic (PLEG): container finished" podID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerID="2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0" exitCode=0 Oct 02 13:22:50 crc kubenswrapper[4724]: I1002 13:22:50.628905 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerDied","Data":"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0"} Oct 02 13:22:51 crc kubenswrapper[4724]: I1002 13:22:51.639265 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerStarted","Data":"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91"} Oct 02 13:22:51 crc kubenswrapper[4724]: I1002 13:22:51.660328 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b64kh" podStartSLOduration=3.207737006 podStartE2EDuration="5.660308776s" podCreationTimestamp="2025-10-02 13:22:46 +0000 UTC" firstStartedPulling="2025-10-02 13:22:48.605901787 +0000 UTC m=+1433.060660908" lastFinishedPulling="2025-10-02 13:22:51.058473557 +0000 UTC m=+1435.513232678" observedRunningTime="2025-10-02 13:22:51.654748592 +0000 UTC m=+1436.109507733" watchObservedRunningTime="2025-10-02 13:22:51.660308776 +0000 UTC m=+1436.115067897" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.018446 4724 scope.go:117] "RemoveContainer" containerID="855ba79ee2030b76fd14e92a5fbea63c6147672e90d9e04bb1c29e6712153ee2" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.270506 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.272045 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.345526 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.741968 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:22:57 crc kubenswrapper[4724]: I1002 13:22:57.805453 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:22:59 crc kubenswrapper[4724]: I1002 13:22:59.703181 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b64kh" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="registry-server" containerID="cri-o://27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91" gracePeriod=2 Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.130210 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.280925 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities\") pod \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.280996 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content\") pod \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.281196 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfc77\" (UniqueName: \"kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77\") pod \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\" (UID: \"7f2c85de-db9e-4b18-9e5f-f8813e7c212c\") " Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.282039 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities" (OuterVolumeSpecName: "utilities") pod "7f2c85de-db9e-4b18-9e5f-f8813e7c212c" (UID: "7f2c85de-db9e-4b18-9e5f-f8813e7c212c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.288721 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77" (OuterVolumeSpecName: "kube-api-access-lfc77") pod "7f2c85de-db9e-4b18-9e5f-f8813e7c212c" (UID: "7f2c85de-db9e-4b18-9e5f-f8813e7c212c"). InnerVolumeSpecName "kube-api-access-lfc77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.350369 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f2c85de-db9e-4b18-9e5f-f8813e7c212c" (UID: "7f2c85de-db9e-4b18-9e5f-f8813e7c212c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.382905 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.382951 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.382972 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfc77\" (UniqueName: \"kubernetes.io/projected/7f2c85de-db9e-4b18-9e5f-f8813e7c212c-kube-api-access-lfc77\") on node \"crc\" DevicePath \"\"" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.714748 4724 generic.go:334] "Generic (PLEG): container finished" podID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerID="27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91" exitCode=0 Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.714811 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerDied","Data":"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91"} Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.714820 4724 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-b64kh" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.714861 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b64kh" event={"ID":"7f2c85de-db9e-4b18-9e5f-f8813e7c212c","Type":"ContainerDied","Data":"25f63afa9987e18a996fcd321d555c90aa2e1dfaf6ad999a7b4dd82ab3f0a8c2"} Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.715096 4724 scope.go:117] "RemoveContainer" containerID="27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.734684 4724 scope.go:117] "RemoveContainer" containerID="2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.750489 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.757243 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b64kh"] Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.757333 4724 scope.go:117] "RemoveContainer" containerID="5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.793965 4724 scope.go:117] "RemoveContainer" containerID="27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91" Oct 02 13:23:00 crc kubenswrapper[4724]: E1002 13:23:00.798197 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91\": container with ID starting with 27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91 not found: ID does not exist" containerID="27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.798263 
4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91"} err="failed to get container status \"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91\": rpc error: code = NotFound desc = could not find container \"27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91\": container with ID starting with 27a1ee457577fa2f1455448662540271b96a527e501346f4c42e933f59f16c91 not found: ID does not exist" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.798296 4724 scope.go:117] "RemoveContainer" containerID="2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0" Oct 02 13:23:00 crc kubenswrapper[4724]: E1002 13:23:00.799085 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0\": container with ID starting with 2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0 not found: ID does not exist" containerID="2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.799123 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0"} err="failed to get container status \"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0\": rpc error: code = NotFound desc = could not find container \"2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0\": container with ID starting with 2d4657f8f959dbc90497c93dd4a9228e5275a4c90a9800ef39db5ea61b5367f0 not found: ID does not exist" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.799151 4724 scope.go:117] "RemoveContainer" containerID="5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5" Oct 02 13:23:00 crc kubenswrapper[4724]: E1002 
13:23:00.799675 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5\": container with ID starting with 5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5 not found: ID does not exist" containerID="5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5" Oct 02 13:23:00 crc kubenswrapper[4724]: I1002 13:23:00.799708 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5"} err="failed to get container status \"5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5\": rpc error: code = NotFound desc = could not find container \"5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5\": container with ID starting with 5778c7498237051ce06ba7dfe7a54bf516f253f4c29ed74ac9ce77315e2577d5 not found: ID does not exist" Oct 02 13:23:02 crc kubenswrapper[4724]: I1002 13:23:02.324637 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" path="/var/lib/kubelet/pods/7f2c85de-db9e-4b18-9e5f-f8813e7c212c/volumes" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.649942 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:23:49 crc kubenswrapper[4724]: E1002 13:23:49.650839 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="extract-utilities" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.650854 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="extract-utilities" Oct 02 13:23:49 crc kubenswrapper[4724]: E1002 13:23:49.650881 4724 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="registry-server" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.650889 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="registry-server" Oct 02 13:23:49 crc kubenswrapper[4724]: E1002 13:23:49.650911 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="extract-content" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.650919 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="extract-content" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.651079 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2c85de-db9e-4b18-9e5f-f8813e7c212c" containerName="registry-server" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.652279 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.678075 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.689597 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.689675 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zwj\" (UniqueName: \"kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj\") pod \"certified-operators-wcwqd\" (UID: 
\"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.689730 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.790967 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.791052 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zwj\" (UniqueName: \"kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.791098 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.791571 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content\") pod \"certified-operators-wcwqd\" (UID: 
\"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.791694 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.811501 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zwj\" (UniqueName: \"kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj\") pod \"certified-operators-wcwqd\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:49 crc kubenswrapper[4724]: I1002 13:23:49.968609 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:50 crc kubenswrapper[4724]: I1002 13:23:50.518443 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:23:51 crc kubenswrapper[4724]: I1002 13:23:51.157968 4724 generic.go:334] "Generic (PLEG): container finished" podID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerID="eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50" exitCode=0 Oct 02 13:23:51 crc kubenswrapper[4724]: I1002 13:23:51.158024 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerDied","Data":"eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50"} Oct 02 13:23:51 crc kubenswrapper[4724]: I1002 13:23:51.158056 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" 
event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerStarted","Data":"17ee0ef2f81afe1988c2114b62b79b46e9f00b29c98a09f8b5f35a7808d05815"} Oct 02 13:23:53 crc kubenswrapper[4724]: I1002 13:23:53.173524 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerStarted","Data":"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc"} Oct 02 13:23:54 crc kubenswrapper[4724]: I1002 13:23:54.181990 4724 generic.go:334] "Generic (PLEG): container finished" podID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerID="d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc" exitCode=0 Oct 02 13:23:54 crc kubenswrapper[4724]: I1002 13:23:54.182031 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerDied","Data":"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc"} Oct 02 13:23:55 crc kubenswrapper[4724]: I1002 13:23:55.190936 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerStarted","Data":"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89"} Oct 02 13:23:55 crc kubenswrapper[4724]: I1002 13:23:55.215407 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wcwqd" podStartSLOduration=2.442710687 podStartE2EDuration="6.215383254s" podCreationTimestamp="2025-10-02 13:23:49 +0000 UTC" firstStartedPulling="2025-10-02 13:23:51.161528659 +0000 UTC m=+1495.616287780" lastFinishedPulling="2025-10-02 13:23:54.934201216 +0000 UTC m=+1499.388960347" observedRunningTime="2025-10-02 13:23:55.212151151 +0000 UTC m=+1499.666910272" watchObservedRunningTime="2025-10-02 13:23:55.215383254 +0000 UTC 
m=+1499.670142375" Oct 02 13:23:57 crc kubenswrapper[4724]: I1002 13:23:57.081637 4724 scope.go:117] "RemoveContainer" containerID="af3133398d7664a5460f6ea8651515a58d074973e709d0f7249da215d86e3c20" Oct 02 13:23:57 crc kubenswrapper[4724]: I1002 13:23:57.102483 4724 scope.go:117] "RemoveContainer" containerID="29a33a00e43225a41a31bc5b7da3b75a51b4d974afce86f87c7f8f77cac9c8cd" Oct 02 13:23:59 crc kubenswrapper[4724]: I1002 13:23:59.969775 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:23:59 crc kubenswrapper[4724]: I1002 13:23:59.970300 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:24:00 crc kubenswrapper[4724]: I1002 13:24:00.015360 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:24:00 crc kubenswrapper[4724]: I1002 13:24:00.276041 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:24:00 crc kubenswrapper[4724]: I1002 13:24:00.325459 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.251885 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wcwqd" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="registry-server" containerID="cri-o://2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89" gracePeriod=2 Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.657335 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.792418 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content\") pod \"189b80fa-5eda-46e8-a057-95929ee8a63c\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.792584 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zwj\" (UniqueName: \"kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj\") pod \"189b80fa-5eda-46e8-a057-95929ee8a63c\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.792624 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities\") pod \"189b80fa-5eda-46e8-a057-95929ee8a63c\" (UID: \"189b80fa-5eda-46e8-a057-95929ee8a63c\") " Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.794352 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities" (OuterVolumeSpecName: "utilities") pod "189b80fa-5eda-46e8-a057-95929ee8a63c" (UID: "189b80fa-5eda-46e8-a057-95929ee8a63c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.799263 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj" (OuterVolumeSpecName: "kube-api-access-f8zwj") pod "189b80fa-5eda-46e8-a057-95929ee8a63c" (UID: "189b80fa-5eda-46e8-a057-95929ee8a63c"). InnerVolumeSpecName "kube-api-access-f8zwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.837839 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "189b80fa-5eda-46e8-a057-95929ee8a63c" (UID: "189b80fa-5eda-46e8-a057-95929ee8a63c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.894070 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.894109 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8zwj\" (UniqueName: \"kubernetes.io/projected/189b80fa-5eda-46e8-a057-95929ee8a63c-kube-api-access-f8zwj\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:02 crc kubenswrapper[4724]: I1002 13:24:02.894121 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/189b80fa-5eda-46e8-a057-95929ee8a63c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.261300 4724 generic.go:334] "Generic (PLEG): container finished" podID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerID="2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89" exitCode=0 Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.261348 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcwqd" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.261368 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerDied","Data":"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89"} Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.262639 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcwqd" event={"ID":"189b80fa-5eda-46e8-a057-95929ee8a63c","Type":"ContainerDied","Data":"17ee0ef2f81afe1988c2114b62b79b46e9f00b29c98a09f8b5f35a7808d05815"} Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.262660 4724 scope.go:117] "RemoveContainer" containerID="2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.279758 4724 scope.go:117] "RemoveContainer" containerID="d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.296977 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.302405 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wcwqd"] Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.323261 4724 scope.go:117] "RemoveContainer" containerID="eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.339594 4724 scope.go:117] "RemoveContainer" containerID="2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89" Oct 02 13:24:03 crc kubenswrapper[4724]: E1002 13:24:03.340089 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89\": container with ID starting with 2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89 not found: ID does not exist" containerID="2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.340136 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89"} err="failed to get container status \"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89\": rpc error: code = NotFound desc = could not find container \"2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89\": container with ID starting with 2e4113505f676c9e26c2f370770fbe7deccc88633cb833dbca02209902756b89 not found: ID does not exist" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.340161 4724 scope.go:117] "RemoveContainer" containerID="d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc" Oct 02 13:24:03 crc kubenswrapper[4724]: E1002 13:24:03.340711 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc\": container with ID starting with d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc not found: ID does not exist" containerID="d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.340757 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc"} err="failed to get container status \"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc\": rpc error: code = NotFound desc = could not find container \"d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc\": container with ID 
starting with d43e3326dbe7203704268195b749454662b9fb4b6f688a1ca73684170ed9f1cc not found: ID does not exist" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.340785 4724 scope.go:117] "RemoveContainer" containerID="eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50" Oct 02 13:24:03 crc kubenswrapper[4724]: E1002 13:24:03.341048 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50\": container with ID starting with eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50 not found: ID does not exist" containerID="eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50" Oct 02 13:24:03 crc kubenswrapper[4724]: I1002 13:24:03.341074 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50"} err="failed to get container status \"eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50\": rpc error: code = NotFound desc = could not find container \"eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50\": container with ID starting with eda709ec96e805af926b7ea24a3f34f2457ad367deae5187a60d98ea58f49a50 not found: ID does not exist" Oct 02 13:24:04 crc kubenswrapper[4724]: I1002 13:24:04.330031 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" path="/var/lib/kubelet/pods/189b80fa-5eda-46e8-a057-95929ee8a63c/volumes" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.433497 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:18 crc kubenswrapper[4724]: E1002 13:24:18.435691 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="extract-utilities" Oct 02 13:24:18 crc 
kubenswrapper[4724]: I1002 13:24:18.435857 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="extract-utilities" Oct 02 13:24:18 crc kubenswrapper[4724]: E1002 13:24:18.435931 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="extract-content" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.435988 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="extract-content" Oct 02 13:24:18 crc kubenswrapper[4724]: E1002 13:24:18.436051 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="registry-server" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.436127 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="registry-server" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.436667 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="189b80fa-5eda-46e8-a057-95929ee8a63c" containerName="registry-server" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.438005 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.440097 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.566927 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.567279 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chm4s\" (UniqueName: \"kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.567323 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.669235 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.669284 4724 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-chm4s\" (UniqueName: \"kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.669308 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.669752 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.669874 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.688195 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chm4s\" (UniqueName: \"kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s\") pod \"redhat-marketplace-ct5lb\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:18 crc kubenswrapper[4724]: I1002 13:24:18.758473 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:19 crc kubenswrapper[4724]: I1002 13:24:19.207960 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:19 crc kubenswrapper[4724]: I1002 13:24:19.398293 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerStarted","Data":"3436160cf536e2d0bceee1565051aa417d99a346c4ab311e0111b010ff218e78"} Oct 02 13:24:20 crc kubenswrapper[4724]: I1002 13:24:20.405630 4724 generic.go:334] "Generic (PLEG): container finished" podID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerID="4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799" exitCode=0 Oct 02 13:24:20 crc kubenswrapper[4724]: I1002 13:24:20.405747 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerDied","Data":"4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799"} Oct 02 13:24:22 crc kubenswrapper[4724]: I1002 13:24:22.422245 4724 generic.go:334] "Generic (PLEG): container finished" podID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerID="21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629" exitCode=0 Oct 02 13:24:22 crc kubenswrapper[4724]: I1002 13:24:22.422298 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerDied","Data":"21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629"} Oct 02 13:24:23 crc kubenswrapper[4724]: I1002 13:24:23.432726 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" 
event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerStarted","Data":"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600"} Oct 02 13:24:23 crc kubenswrapper[4724]: I1002 13:24:23.454826 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ct5lb" podStartSLOduration=3.04476264 podStartE2EDuration="5.454799628s" podCreationTimestamp="2025-10-02 13:24:18 +0000 UTC" firstStartedPulling="2025-10-02 13:24:20.407263006 +0000 UTC m=+1524.862022127" lastFinishedPulling="2025-10-02 13:24:22.817299994 +0000 UTC m=+1527.272059115" observedRunningTime="2025-10-02 13:24:23.451224985 +0000 UTC m=+1527.905984116" watchObservedRunningTime="2025-10-02 13:24:23.454799628 +0000 UTC m=+1527.909558749" Oct 02 13:24:28 crc kubenswrapper[4724]: I1002 13:24:28.758779 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:28 crc kubenswrapper[4724]: I1002 13:24:28.759350 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:28 crc kubenswrapper[4724]: I1002 13:24:28.827368 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:29 crc kubenswrapper[4724]: I1002 13:24:29.542414 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:29 crc kubenswrapper[4724]: I1002 13:24:29.593790 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.499993 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ct5lb" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="registry-server" 
containerID="cri-o://8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600" gracePeriod=2 Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.898850 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.976593 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chm4s\" (UniqueName: \"kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s\") pod \"e4141d2e-bd7f-447d-8192-5d5ffa471157\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.976679 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities\") pod \"e4141d2e-bd7f-447d-8192-5d5ffa471157\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.976713 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content\") pod \"e4141d2e-bd7f-447d-8192-5d5ffa471157\" (UID: \"e4141d2e-bd7f-447d-8192-5d5ffa471157\") " Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.978118 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities" (OuterVolumeSpecName: "utilities") pod "e4141d2e-bd7f-447d-8192-5d5ffa471157" (UID: "e4141d2e-bd7f-447d-8192-5d5ffa471157"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.983360 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s" (OuterVolumeSpecName: "kube-api-access-chm4s") pod "e4141d2e-bd7f-447d-8192-5d5ffa471157" (UID: "e4141d2e-bd7f-447d-8192-5d5ffa471157"). InnerVolumeSpecName "kube-api-access-chm4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:24:31 crc kubenswrapper[4724]: I1002 13:24:31.989492 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4141d2e-bd7f-447d-8192-5d5ffa471157" (UID: "e4141d2e-bd7f-447d-8192-5d5ffa471157"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.079149 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chm4s\" (UniqueName: \"kubernetes.io/projected/e4141d2e-bd7f-447d-8192-5d5ffa471157-kube-api-access-chm4s\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.079190 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.079205 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4141d2e-bd7f-447d-8192-5d5ffa471157-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.513915 4724 generic.go:334] "Generic (PLEG): container finished" podID="e4141d2e-bd7f-447d-8192-5d5ffa471157" 
containerID="8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600" exitCode=0 Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.513983 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerDied","Data":"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600"} Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.514051 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ct5lb" event={"ID":"e4141d2e-bd7f-447d-8192-5d5ffa471157","Type":"ContainerDied","Data":"3436160cf536e2d0bceee1565051aa417d99a346c4ab311e0111b010ff218e78"} Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.514066 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ct5lb" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.514075 4724 scope.go:117] "RemoveContainer" containerID="8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.550707 4724 scope.go:117] "RemoveContainer" containerID="21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.557731 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.568348 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ct5lb"] Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.577768 4724 scope.go:117] "RemoveContainer" containerID="4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.603651 4724 scope.go:117] "RemoveContainer" containerID="8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600" Oct 02 
13:24:32 crc kubenswrapper[4724]: E1002 13:24:32.604097 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600\": container with ID starting with 8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600 not found: ID does not exist" containerID="8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.604157 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600"} err="failed to get container status \"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600\": rpc error: code = NotFound desc = could not find container \"8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600\": container with ID starting with 8199dde637653eb8398a8626ccb9e0da00ed4fe9c9230360b3855909c77a5600 not found: ID does not exist" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.604192 4724 scope.go:117] "RemoveContainer" containerID="21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629" Oct 02 13:24:32 crc kubenswrapper[4724]: E1002 13:24:32.604530 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629\": container with ID starting with 21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629 not found: ID does not exist" containerID="21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.604584 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629"} err="failed to get container status 
\"21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629\": rpc error: code = NotFound desc = could not find container \"21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629\": container with ID starting with 21ab2dec1ca61ce1a7b519f0ad4072572e9aed70f57cac8466f5d52e7ad09629 not found: ID does not exist" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.604610 4724 scope.go:117] "RemoveContainer" containerID="4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799" Oct 02 13:24:32 crc kubenswrapper[4724]: E1002 13:24:32.604923 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799\": container with ID starting with 4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799 not found: ID does not exist" containerID="4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799" Oct 02 13:24:32 crc kubenswrapper[4724]: I1002 13:24:32.604956 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799"} err="failed to get container status \"4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799\": rpc error: code = NotFound desc = could not find container \"4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799\": container with ID starting with 4e3e62b967cd6ac17e1852d9836820d9fbdafb8274447157daf2882c8102d799 not found: ID does not exist" Oct 02 13:24:34 crc kubenswrapper[4724]: I1002 13:24:34.326124 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" path="/var/lib/kubelet/pods/e4141d2e-bd7f-447d-8192-5d5ffa471157/volumes" Oct 02 13:24:50 crc kubenswrapper[4724]: I1002 13:24:50.047078 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-wz42b"] Oct 02 13:24:50 
crc kubenswrapper[4724]: I1002 13:24:50.055007 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-wz42b"] Oct 02 13:24:50 crc kubenswrapper[4724]: I1002 13:24:50.322897 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c4f42f-bce9-409f-9cec-2f315dc87943" path="/var/lib/kubelet/pods/36c4f42f-bce9-409f-9cec-2f315dc87943/volumes" Oct 02 13:24:57 crc kubenswrapper[4724]: I1002 13:24:57.196521 4724 scope.go:117] "RemoveContainer" containerID="17ee2c8a8311d96e0001a4526d8cb38f53b7e0ee5a566195baf35e75544be84b" Oct 02 13:24:57 crc kubenswrapper[4724]: I1002 13:24:57.231025 4724 scope.go:117] "RemoveContainer" containerID="7eab315f1cda69b1bb2594ac4cdc239054b59199569e93f9475c0f12dc21df3b" Oct 02 13:24:57 crc kubenswrapper[4724]: I1002 13:24:57.283427 4724 scope.go:117] "RemoveContainer" containerID="0ea2f0515db05054b992d352e760b563d7394d0f342ef6f49397c4bc7e9e437d" Oct 02 13:24:57 crc kubenswrapper[4724]: I1002 13:24:57.316352 4724 scope.go:117] "RemoveContainer" containerID="204e46c1c7037466030eedd4eb34f20a40256857e4df30f98b58bc98a12fc1f7" Oct 02 13:24:57 crc kubenswrapper[4724]: I1002 13:24:57.369340 4724 scope.go:117] "RemoveContainer" containerID="ba9cacc4608bb53e03730519aefed01612414d7fb19cb85cab79a12b7368f1c9" Oct 02 13:25:00 crc kubenswrapper[4724]: I1002 13:25:00.035251 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-4534-account-create-zpl5t"] Oct 02 13:25:00 crc kubenswrapper[4724]: I1002 13:25:00.045283 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-4534-account-create-zpl5t"] Oct 02 13:25:00 crc kubenswrapper[4724]: I1002 13:25:00.326900 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c2ee54e-6822-42cb-940b-0734caf0ba50" path="/var/lib/kubelet/pods/7c2ee54e-6822-42cb-940b-0734caf0ba50/volumes" Oct 02 13:25:04 crc kubenswrapper[4724]: I1002 13:25:04.734295 4724 patch_prober.go:28] 
interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:25:04 crc kubenswrapper[4724]: I1002 13:25:04.735661 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:25:16 crc kubenswrapper[4724]: I1002 13:25:16.027030 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-g8m6n"] Oct 02 13:25:16 crc kubenswrapper[4724]: I1002 13:25:16.032771 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-g8m6n"] Oct 02 13:25:16 crc kubenswrapper[4724]: I1002 13:25:16.322980 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc" path="/var/lib/kubelet/pods/6ed4b4c5-14ff-4cf7-bb9f-99d9e0326dcc/volumes" Oct 02 13:25:26 crc kubenswrapper[4724]: I1002 13:25:26.056022 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-b4gt5"] Oct 02 13:25:26 crc kubenswrapper[4724]: I1002 13:25:26.064749 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-b4gt5"] Oct 02 13:25:26 crc kubenswrapper[4724]: I1002 13:25:26.325086 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740c0d97-42e5-421d-9def-352ddefb326c" path="/var/lib/kubelet/pods/740c0d97-42e5-421d-9def-352ddefb326c/volumes" Oct 02 13:25:34 crc kubenswrapper[4724]: I1002 13:25:34.733974 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:25:34 crc kubenswrapper[4724]: I1002 13:25:34.734560 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:25:57 crc kubenswrapper[4724]: I1002 13:25:57.475946 4724 scope.go:117] "RemoveContainer" containerID="7a27c7c57714900861ce163e3d00ce5b99fff0180b7d17ffd6eda52f9c62eba5" Oct 02 13:25:57 crc kubenswrapper[4724]: I1002 13:25:57.505726 4724 scope.go:117] "RemoveContainer" containerID="82ebaf4fcfc4c9fdd8e051efb8e53b45a5c30f50b3a57245e4f64a180a5f7227" Oct 02 13:25:57 crc kubenswrapper[4724]: I1002 13:25:57.556838 4724 scope.go:117] "RemoveContainer" containerID="ab7229977a329097a0842af112f62d676f4c303e70b2dc7b43f9396323489818" Oct 02 13:25:57 crc kubenswrapper[4724]: I1002 13:25:57.573778 4724 scope.go:117] "RemoveContainer" containerID="d99e7398bd032c3cb19d91987704bc072ca77672dc6529cd6ca63f0c3ff22cdc" Oct 02 13:26:04 crc kubenswrapper[4724]: I1002 13:26:04.734240 4724 patch_prober.go:28] interesting pod/machine-config-daemon-74k4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:26:04 crc kubenswrapper[4724]: I1002 13:26:04.734930 4724 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:26:04 crc kubenswrapper[4724]: I1002 13:26:04.734976 4724 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" Oct 02 13:26:04 crc kubenswrapper[4724]: I1002 13:26:04.735643 4724 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a"} pod="openshift-machine-config-operator/machine-config-daemon-74k4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:26:04 crc kubenswrapper[4724]: I1002 13:26:04.735718 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" containerName="machine-config-daemon" containerID="cri-o://5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" gracePeriod=600 Oct 02 13:26:04 crc kubenswrapper[4724]: E1002 13:26:04.857183 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:26:05 crc kubenswrapper[4724]: I1002 13:26:05.277432 4724 generic.go:334] "Generic (PLEG): container finished" podID="f6090eaa-c182-4788-950c-16352c271233" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" exitCode=0 Oct 02 13:26:05 crc kubenswrapper[4724]: I1002 13:26:05.277493 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerDied","Data":"5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a"} Oct 02 13:26:05 crc kubenswrapper[4724]: I1002 13:26:05.277573 4724 scope.go:117] "RemoveContainer" containerID="6ce034e637cd5db41f6c20bcc63e0101e74c2ad03481f5a6d5b4f08ea38e8992" Oct 02 13:26:05 crc kubenswrapper[4724]: I1002 13:26:05.278292 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:26:05 crc kubenswrapper[4724]: E1002 13:26:05.278796 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.908865 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:26:15 crc kubenswrapper[4724]: E1002 13:26:15.909621 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="extract-utilities" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.909634 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="extract-utilities" Oct 02 13:26:15 crc kubenswrapper[4724]: E1002 13:26:15.909650 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="registry-server" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.909656 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" 
containerName="registry-server" Oct 02 13:26:15 crc kubenswrapper[4724]: E1002 13:26:15.909668 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="extract-content" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.909674 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="extract-content" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.909813 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4141d2e-bd7f-447d-8192-5d5ffa471157" containerName="registry-server" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.910287 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.916271 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.916512 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.916785 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.916945 4724 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-87nz2" Oct 02 13:26:15 crc kubenswrapper[4724]: I1002 13:26:15.917029 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.006155 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj26h\" (UniqueName: \"kubernetes.io/projected/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-kube-api-access-pj26h\") pod \"openstackclient\" 
(UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.006339 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.006464 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-scripts\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.006495 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config-secret\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.108251 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.108311 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.108330 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-scripts\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.108390 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj26h\" (UniqueName: \"kubernetes.io/projected/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-kube-api-access-pj26h\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.109413 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.110345 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-scripts\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.118053 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-openstack-config-secret\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc 
kubenswrapper[4724]: I1002 13:26:16.124339 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj26h\" (UniqueName: \"kubernetes.io/projected/7be7e11e-a89b-474c-a1e6-e4d8b72e9b02-kube-api-access-pj26h\") pod \"openstackclient\" (UID: \"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02\") " pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.235778 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Oct 02 13:26:16 crc kubenswrapper[4724]: I1002 13:26:16.741151 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Oct 02 13:26:17 crc kubenswrapper[4724]: I1002 13:26:17.373174 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02","Type":"ContainerStarted","Data":"a47da0ecfd340ae4acee34a7ee8312674ac7b76957e7c2fbf6d83021fc2ad321"} Oct 02 13:26:17 crc kubenswrapper[4724]: I1002 13:26:17.374424 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7be7e11e-a89b-474c-a1e6-e4d8b72e9b02","Type":"ContainerStarted","Data":"46a169fd9332e1a3ce8301fb1c38dfdb052224e59ab830f9223ed9167577dbbf"} Oct 02 13:26:17 crc kubenswrapper[4724]: I1002 13:26:17.400388 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.400371676 podStartE2EDuration="2.400371676s" podCreationTimestamp="2025-10-02 13:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:26:17.394494093 +0000 UTC m=+1641.849253214" watchObservedRunningTime="2025-10-02 13:26:17.400371676 +0000 UTC m=+1641.855130797" Oct 02 13:26:19 crc kubenswrapper[4724]: I1002 13:26:19.313609 4724 scope.go:117] "RemoveContainer" 
containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:26:19 crc kubenswrapper[4724]: E1002 13:26:19.314679 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:26:30 crc kubenswrapper[4724]: I1002 13:26:30.313786 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:26:30 crc kubenswrapper[4724]: E1002 13:26:30.314707 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:26:44 crc kubenswrapper[4724]: I1002 13:26:44.313450 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:26:44 crc kubenswrapper[4724]: E1002 13:26:44.314332 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:26:58 crc kubenswrapper[4724]: I1002 13:26:58.314584 4724 scope.go:117] 
"RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:26:58 crc kubenswrapper[4724]: E1002 13:26:58.315426 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:27:10 crc kubenswrapper[4724]: I1002 13:27:10.314419 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:27:10 crc kubenswrapper[4724]: E1002 13:27:10.315486 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:27:24 crc kubenswrapper[4724]: I1002 13:27:24.314385 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:27:24 crc kubenswrapper[4724]: E1002 13:27:24.315657 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.334659 
4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-slsj8/must-gather-pqpxq"] Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.337475 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.341386 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-slsj8"/"kube-root-ca.crt" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.341710 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-slsj8"/"openshift-service-ca.crt" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.371599 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-slsj8/must-gather-pqpxq"] Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.428234 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output\") pod \"must-gather-pqpxq\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.428380 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4x2\" (UniqueName: \"kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2\") pod \"must-gather-pqpxq\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.530421 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output\") pod \"must-gather-pqpxq\" (UID: 
\"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.530574 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4x2\" (UniqueName: \"kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2\") pod \"must-gather-pqpxq\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.531093 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output\") pod \"must-gather-pqpxq\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.569115 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4x2\" (UniqueName: \"kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2\") pod \"must-gather-pqpxq\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:33 crc kubenswrapper[4724]: I1002 13:27:33.659803 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:27:34 crc kubenswrapper[4724]: I1002 13:27:34.179933 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-slsj8/must-gather-pqpxq"] Oct 02 13:27:34 crc kubenswrapper[4724]: W1002 13:27:34.188129 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22842c3_d802_495a_9576_a187044f44d8.slice/crio-3801d012ba6727c881b4b19c559d52487d766be508ba9f96e66a7e0fb3344151 WatchSource:0}: Error finding container 3801d012ba6727c881b4b19c559d52487d766be508ba9f96e66a7e0fb3344151: Status 404 returned error can't find the container with id 3801d012ba6727c881b4b19c559d52487d766be508ba9f96e66a7e0fb3344151 Oct 02 13:27:35 crc kubenswrapper[4724]: I1002 13:27:35.003571 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-slsj8/must-gather-pqpxq" event={"ID":"e22842c3-d802-495a-9576-a187044f44d8","Type":"ContainerStarted","Data":"3801d012ba6727c881b4b19c559d52487d766be508ba9f96e66a7e0fb3344151"} Oct 02 13:27:35 crc kubenswrapper[4724]: I1002 13:27:35.313673 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:27:35 crc kubenswrapper[4724]: E1002 13:27:35.314078 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:27:42 crc kubenswrapper[4724]: I1002 13:27:42.064324 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-slsj8/must-gather-pqpxq" 
event={"ID":"e22842c3-d802-495a-9576-a187044f44d8","Type":"ContainerStarted","Data":"45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b"} Oct 02 13:27:42 crc kubenswrapper[4724]: I1002 13:27:42.064934 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-slsj8/must-gather-pqpxq" event={"ID":"e22842c3-d802-495a-9576-a187044f44d8","Type":"ContainerStarted","Data":"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4"} Oct 02 13:27:42 crc kubenswrapper[4724]: I1002 13:27:42.081182 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-slsj8/must-gather-pqpxq" podStartSLOduration=2.22763254 podStartE2EDuration="9.08116319s" podCreationTimestamp="2025-10-02 13:27:33 +0000 UTC" firstStartedPulling="2025-10-02 13:27:34.191383802 +0000 UTC m=+1718.646142923" lastFinishedPulling="2025-10-02 13:27:41.044914452 +0000 UTC m=+1725.499673573" observedRunningTime="2025-10-02 13:27:42.081037927 +0000 UTC m=+1726.535797058" watchObservedRunningTime="2025-10-02 13:27:42.08116319 +0000 UTC m=+1726.535922311" Oct 02 13:27:48 crc kubenswrapper[4724]: I1002 13:27:48.314466 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:27:48 crc kubenswrapper[4724]: E1002 13:27:48.315239 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:01 crc kubenswrapper[4724]: I1002 13:28:01.314592 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:28:01 crc kubenswrapper[4724]: 
E1002 13:28:01.315901 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:15 crc kubenswrapper[4724]: I1002 13:28:15.313206 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:28:15 crc kubenswrapper[4724]: E1002 13:28:15.314067 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.486058 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/util/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.650142 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/util/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.678515 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/pull/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.678794 4724 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/pull/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.907465 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/pull/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.926570 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/util/0.log" Oct 02 13:28:18 crc kubenswrapper[4724]: I1002 13:28:18.927909 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_24f71a75347ad7a5f841d153f802ac72a45ac2356b85254a24ab5c9f58jvk6g_0de57d48-755b-4e1c-a6f0-88e5cb02d827/extract/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.106105 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/util/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.288202 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/pull/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.326925 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/pull/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.453561 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/util/0.log" Oct 
02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.683202 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/pull/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.689173 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/extract/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.696997 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bfb89b7a15e902ec1ce651098a1cbdcb0a2281c38e30d9a342b952813zcwwv_cec254d8-8ae9-44e7-b7f8-40a87a42ca6c/util/0.log" Oct 02 13:28:19 crc kubenswrapper[4724]: I1002 13:28:19.888777 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.104270 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.105553 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/pull/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.108387 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/pull/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.301848 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.306194 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/pull/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.345846 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907hp2b_0c83815e-5c93-45f4-9a44-48bb3e26f9c1/extract/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.497416 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.682477 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/pull/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.709236 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.716147 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/pull/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.853331 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/pull/0.log" Oct 02 13:28:20 crc 
kubenswrapper[4724]: I1002 13:28:20.879490 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/util/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.895796 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ac8ca47b6b256283a3e13fdd5b2e3854465d68c59374f2edea232a7cb3nvnvh_51f95db5-8f9f-449c-8bc7-04ebf10c4f97/extract/0.log" Oct 02 13:28:20 crc kubenswrapper[4724]: I1002 13:28:20.989323 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.152630 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.156996 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.157033 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.338368 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/extract/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.353137 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.353794 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c77910536a79801a83f49d4fd4581e5a2972791dfc31ed0ea9f0ffea32fsqpq_18d91c41-08cd-4174-b468-54e2142c767e/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.384654 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.560497 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.592622 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.599883 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.734918 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/util/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.756806 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/extract/0.log" Oct 02 13:28:21 crc 
kubenswrapper[4724]: I1002 13:28:21.798979 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d93b99dddc714b0f4b2148f40016b9ead21cc18743d58ffe812e1bd436nj67f_c72227ad-ec88-4c28-b96e-989d90f420e8/pull/0.log" Oct 02 13:28:21 crc kubenswrapper[4724]: I1002 13:28:21.849509 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/util/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.004879 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/pull/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.018616 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/pull/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.022182 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/util/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.198810 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/extract/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.262923 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/pull/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.265661 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d4448d4c-tcwsh_632aef42-b937-4494-bf3a-7251fb7fb975/kube-rbac-proxy/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.280994 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ec20a04ef7278338c96ca90950ec47944973b8553e1da5c6f2ce730402cv6v2_61201eac-406d-4bed-a59f-aa4fe87eebca/util/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.441523 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-index-d8sdv_4520d4cc-dd9a-4dd9-b506-f41c6fc537ed/registry-server/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.470781 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d4448d4c-tcwsh_632aef42-b937-4494-bf3a-7251fb7fb975/manager/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.606822 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-55ffbdd8b6-6mvsc_2189dcc9-69ab-445f-83a7-2491a7ecb038/manager/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.615327 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-55ffbdd8b6-6mvsc_2189dcc9-69ab-445f-83a7-2491a7ecb038/kube-rbac-proxy/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.715572 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-v5l8v_7f4f2819-785c-4e86-9ed7-21e0b606a214/registry-server/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.792478 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-75d774d5cf-hbbvj_d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e/kube-rbac-proxy/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.807872 4724 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-75d774d5cf-hbbvj_d5f7ea4a-c39f-4e03-bb32-0ef1cd9f322e/manager/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.892339 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-5h4lj_e19f04a2-01f2-43d1-b30c-9a0b2e9662b1/registry-server/0.log" Oct 02 13:28:22 crc kubenswrapper[4724]: I1002 13:28:22.992219 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-c48d5fbd5-pt768_b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d/kube-rbac-proxy/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.055340 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-c48d5fbd5-pt768_b1ea0368-8cba-4ff8-bb45-2c1b377b3a2d/manager/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.135110 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-6ctwc_f7081f4f-6b12-4a89-b7bc-e24117cbf951/registry-server/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.218488 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7cdf88d46-nvsmc_58c544da-0083-4968-a2bd-75944651e5ea/kube-rbac-proxy/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.236587 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7cdf88d46-nvsmc_58c544da-0083-4968-a2bd-75944651e5ea/manager/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.355700 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-4st9w_29c00177-6564-4e27-a1de-8f60e8dfc89c/registry-server/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.416123 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-nk8kk_7d397c20-cea7-4b10-8861-1b1da35bcc17/operator/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.562285 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-jxtww_962102f1-5774-4ad9-988f-c1cd36d66caa/registry-server/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.577683 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5947468b68-64ngp_9e2ec7b8-85ef-400d-ac94-39a733e729aa/kube-rbac-proxy/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.620189 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5947468b68-64ngp_9e2ec7b8-85ef-400d-ac94-39a733e729aa/manager/0.log" Oct 02 13:28:23 crc kubenswrapper[4724]: I1002 13:28:23.788186 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-6wl8t_bea86036-f472-4a34-bc7a-dc5ccfa9dc77/registry-server/0.log" Oct 02 13:28:29 crc kubenswrapper[4724]: I1002 13:28:29.314685 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:28:29 crc kubenswrapper[4724]: E1002 13:28:29.315318 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:37 crc kubenswrapper[4724]: I1002 13:28:37.852521 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wjrm6_efa1155d-cd1a-496d-94e8-eecbee129061/control-plane-machine-set-operator/0.log" Oct 02 13:28:37 crc kubenswrapper[4724]: I1002 13:28:37.991912 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x726v_27c39377-9fc1-4ca9-8ce4-8a1c61f181c0/kube-rbac-proxy/0.log" Oct 02 13:28:38 crc kubenswrapper[4724]: I1002 13:28:38.070323 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x726v_27c39377-9fc1-4ca9-8ce4-8a1c61f181c0/machine-api-operator/0.log" Oct 02 13:28:41 crc kubenswrapper[4724]: I1002 13:28:41.314082 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:28:41 crc kubenswrapper[4724]: E1002 13:28:41.314874 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:53 crc kubenswrapper[4724]: I1002 13:28:53.313315 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:28:53 crc kubenswrapper[4724]: E1002 13:28:53.314087 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" 
podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.431031 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lcf4r_5fd4062f-1d62-422b-8190-b3392d13b74e/controller/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.464974 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lcf4r_5fd4062f-1d62-422b-8190-b3392d13b74e/kube-rbac-proxy/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.589738 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-frr-files/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.762380 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-frr-files/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.762382 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-reloader/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.768515 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-reloader/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.775245 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-metrics/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.954214 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-reloader/0.log" Oct 02 13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.967942 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-metrics/0.log" Oct 02 
13:28:54 crc kubenswrapper[4724]: I1002 13:28:54.994453 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-frr-files/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.000224 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-metrics/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.191914 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-reloader/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.191928 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-frr-files/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.192253 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/cp-metrics/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.206681 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/controller/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.361289 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/frr-metrics/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.419694 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/kube-rbac-proxy-frr/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.426115 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/kube-rbac-proxy/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.610665 4724 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/reloader/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.674839 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sg7xz_174f90c3-3227-45c0-b74f-541b539be8d5/frr-k8s-webhook-server/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.900629 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54bb9cccbc-82d7b_9eb7bd6b-79d0-4b9d-876d-36b22622162d/manager/0.log" Oct 02 13:28:55 crc kubenswrapper[4724]: I1002 13:28:55.928187 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrpd4_4d5fd30f-d080-4759-b828-d40f2293c6c7/frr/0.log" Oct 02 13:28:56 crc kubenswrapper[4724]: I1002 13:28:56.065139 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57f56bd847-ffkcc_803bc5d2-e4fe-49e8-abd0-737bd3ccc2f3/webhook-server/0.log" Oct 02 13:28:56 crc kubenswrapper[4724]: I1002 13:28:56.171138 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kvdgz_1de6035f-4c39-40b4-af8b-24fed7520702/kube-rbac-proxy/0.log" Oct 02 13:28:56 crc kubenswrapper[4724]: I1002 13:28:56.332523 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kvdgz_1de6035f-4c39-40b4-af8b-24fed7520702/speaker/0.log" Oct 02 13:29:04 crc kubenswrapper[4724]: I1002 13:29:04.314294 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:29:04 crc kubenswrapper[4724]: E1002 13:29:04.316333 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.049249 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-cee3-account-create-pmhd5_5d5bb294-8805-4ebb-9ccb-bfa05522a8ea/mariadb-account-create/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.245391 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-create-jbs2p_6aaa371f-e098-479e-a2f5-a90f58ab1e57/mariadb-database-create/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.372152 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-sync-zsx66_29911b4c-644e-4928-b3d4-be90c7009131/glance-db-sync/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.468399 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_240e1579-5ea8-45fe-97e7-707ca8f6622f/glance-api/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.516035 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_240e1579-5ea8-45fe-97e7-707ca8f6622f/glance-httpd/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.574233 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_240e1579-5ea8-45fe-97e7-707ca8f6622f/glance-log/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.702153 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_7d8fb847-abad-4b54-91db-7da2ad47cfb2/glance-api/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.720884 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_7d8fb847-abad-4b54-91db-7da2ad47cfb2/glance-httpd/0.log" Oct 02 13:29:10 crc kubenswrapper[4724]: I1002 13:29:10.773308 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_7d8fb847-abad-4b54-91db-7da2ad47cfb2/glance-log/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.097633 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_d97fb367-636d-4a88-ae9f-eca3e33182ac/memcached/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.178642 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_fa83e829-3df9-425f-91ba-2271f0c201ab/mysql-bootstrap/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.213445 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-86d476bccd-dbrt4_a7202e55-07f4-4190-8fc9-7ff3d6c5581f/keystone-api/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.363291 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_fa83e829-3df9-425f-91ba-2271f0c201ab/mysql-bootstrap/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.417136 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_fa83e829-3df9-425f-91ba-2271f0c201ab/galera/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.461040 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_2d30088c-495e-4cd7-892b-f33848b4d5be/mysql-bootstrap/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.660918 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_2d30088c-495e-4cd7-892b-f33848b4d5be/galera/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.699463 4724 log.go:25] "Finished 
parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_dae47a84-6fb7-42c6-ad5b-415ca57924b3/mysql-bootstrap/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.866211 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_2d30088c-495e-4cd7-892b-f33848b4d5be/mysql-bootstrap/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.915528 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_dae47a84-6fb7-42c6-ad5b-415ca57924b3/galera/0.log" Oct 02 13:29:11 crc kubenswrapper[4724]: I1002 13:29:11.936718 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_dae47a84-6fb7-42c6-ad5b-415ca57924b3/mysql-bootstrap/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.051357 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_7be7e11e-a89b-474c-a1e6-e4d8b72e9b02/openstackclient/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.130560 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_cfb3d3fc-ef69-4586-89af-7b9d221d61d7/setup-container/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.341631 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_cfb3d3fc-ef69-4586-89af-7b9d221d61d7/setup-container/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.353637 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_cfb3d3fc-ef69-4586-89af-7b9d221d61d7/rabbitmq/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.484032 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-59cb459c9f-9zrgv_ea2cd8ea-3dbd-4076-b104-762a58eb1868/proxy-httpd/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.557889 4724 log.go:25] 
"Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-59cb459c9f-9zrgv_ea2cd8ea-3dbd-4076-b104-762a58eb1868/proxy-server/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.582855 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-ml7nk_f0e00bd8-bde9-44dd-b71e-ec362b86bd23/swift-ring-rebalance/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.748242 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/account-reaper/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.767097 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/account-auditor/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.891105 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/account-replicator/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.960399 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/account-server/0.log" Oct 02 13:29:12 crc kubenswrapper[4724]: I1002 13:29:12.998282 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/container-replicator/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.001031 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/container-auditor/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.113982 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/container-server/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: 
I1002 13:29:13.169854 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/container-updater/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.218728 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/object-auditor/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.266094 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/object-expirer/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.316331 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/object-server/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.331406 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/object-replicator/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.408929 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/object-updater/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.436700 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/rsync/0.log" Oct 02 13:29:13 crc kubenswrapper[4724]: I1002 13:29:13.458249 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_bee480a1-1b36-4795-befb-3a6d39bab686/swift-recon-cron/0.log" Oct 02 13:29:18 crc kubenswrapper[4724]: I1002 13:29:18.313708 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:29:18 crc kubenswrapper[4724]: E1002 13:29:18.314364 4724 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.493079 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/util/0.log" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.668186 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/pull/0.log" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.668314 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/pull/0.log" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.804265 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/util/0.log" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.871574 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/extract/0.log" Oct 02 13:29:26 crc kubenswrapper[4724]: I1002 13:29:26.877131 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/util/0.log" Oct 02 13:29:26 crc 
kubenswrapper[4724]: I1002 13:29:26.896056 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2gkbdx_7c42221c-4802-40e9-b4ef-244e6e79d969/pull/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.204325 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-utilities/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.410568 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-utilities/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.450739 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-content/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.453115 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-content/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.637364 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-utilities/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.647597 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/extract-content/0.log" Oct 02 13:29:27 crc kubenswrapper[4724]: I1002 13:29:27.895191 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-utilities/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.097121 4724 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-utilities/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.138007 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-content/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.239354 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmg79_5a6d4c8e-6d6b-4da3-b36c-7f8c4bd6c469/registry-server/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.257261 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-content/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.377103 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-utilities/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.390079 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/extract-content/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.605935 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rp6lr_fc56248d-2826-4811-97eb-86c6ffa04f61/marketplace-operator/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.804362 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-utilities/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.816843 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-kz8p5_bf1c3eef-48b3-4563-99fa-0a33d1c6835a/registry-server/0.log" Oct 02 13:29:28 crc kubenswrapper[4724]: I1002 13:29:28.989035 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.035555 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.045473 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-utilities/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.293686 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-utilities/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.332511 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.460205 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t7kvp_916861ea-15de-4a65-b053-4474fb141748/registry-server/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.501456 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-utilities/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.675153 4724 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.709163 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-utilities/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.730093 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.908343 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-content/0.log" Oct 02 13:29:29 crc kubenswrapper[4724]: I1002 13:29:29.939685 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/extract-utilities/0.log" Oct 02 13:29:30 crc kubenswrapper[4724]: I1002 13:29:30.401828 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qvzc9_8ce22671-98d5-4e0e-9851-7da087e63499/registry-server/0.log" Oct 02 13:29:31 crc kubenswrapper[4724]: I1002 13:29:31.313464 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:29:31 crc kubenswrapper[4724]: E1002 13:29:31.314128 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:29:34 crc 
kubenswrapper[4724]: I1002 13:29:34.041465 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbs2p"] Oct 02 13:29:34 crc kubenswrapper[4724]: I1002 13:29:34.047479 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbs2p"] Oct 02 13:29:34 crc kubenswrapper[4724]: I1002 13:29:34.321599 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aaa371f-e098-479e-a2f5-a90f58ab1e57" path="/var/lib/kubelet/pods/6aaa371f-e098-479e-a2f5-a90f58ab1e57/volumes" Oct 02 13:29:43 crc kubenswrapper[4724]: I1002 13:29:43.021389 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cee3-account-create-pmhd5"] Oct 02 13:29:43 crc kubenswrapper[4724]: I1002 13:29:43.029648 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cee3-account-create-pmhd5"] Oct 02 13:29:44 crc kubenswrapper[4724]: I1002 13:29:44.322578 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5bb294-8805-4ebb-9ccb-bfa05522a8ea" path="/var/lib/kubelet/pods/5d5bb294-8805-4ebb-9ccb-bfa05522a8ea/volumes" Oct 02 13:29:46 crc kubenswrapper[4724]: I1002 13:29:46.325268 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:29:46 crc kubenswrapper[4724]: E1002 13:29:46.335010 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:29:52 crc kubenswrapper[4724]: I1002 13:29:52.040393 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-db-sync-zsx66"] Oct 02 13:29:52 crc kubenswrapper[4724]: I1002 13:29:52.062213 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zsx66"] Oct 02 13:29:52 crc kubenswrapper[4724]: I1002 13:29:52.329070 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29911b4c-644e-4928-b3d4-be90c7009131" path="/var/lib/kubelet/pods/29911b4c-644e-4928-b3d4-be90c7009131/volumes" Oct 02 13:29:57 crc kubenswrapper[4724]: I1002 13:29:57.721113 4724 scope.go:117] "RemoveContainer" containerID="33bb4711255d18e7877afd8ef4db93bcfc508253e09d25b0fd3e486b235bf183" Oct 02 13:29:57 crc kubenswrapper[4724]: I1002 13:29:57.740618 4724 scope.go:117] "RemoveContainer" containerID="cb75b5491f47c29561dac07adb19990f1f5d516da1a88e9854f672df3422990d" Oct 02 13:29:57 crc kubenswrapper[4724]: I1002 13:29:57.796011 4724 scope.go:117] "RemoveContainer" containerID="63b398025309fdd40adac91eb83dc0fccf1d1314ad7e354307fc0f2a7cce76ba" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.151955 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.153440 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.159898 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.161098 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.167081 4724 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.167470 4724 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.169295 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.183032 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.202062 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.203129 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.230189 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22"] Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263126 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263176 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263456 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263490 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263518 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vddgq\" (UniqueName: \"kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263567 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzc6\" (UniqueName: \"kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263595 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnshl\" (UniqueName: \"kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263619 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.263647 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.303288 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.308240 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.314549 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:30:00 crc kubenswrapper[4724]: E1002 13:30:00.315176 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.365657 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.365805 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.365831 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.365927 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.366562 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vddgq\" (UniqueName: \"kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.366603 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzc6\" (UniqueName: \"kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.366641 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnshl\" (UniqueName: \"kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.367334 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.370343 4724 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.371687 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.371859 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.385458 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szzc6\" (UniqueName: \"kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6\") pod \"collect-profiles-29323530-kzdnt\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.386388 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnshl\" (UniqueName: \"kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl\") pod 
\"glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.390085 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vddgq\" (UniqueName: \"kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq\") pod \"glance-cache-glance-default-external-api-0-cleaner-29323538brpl\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.482438 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.492233 4724 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.533898 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:00 crc kubenswrapper[4724]: I1002 13:30:00.930282 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl"] Oct 02 13:30:01 crc kubenswrapper[4724]: I1002 13:30:01.009898 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt"] Oct 02 13:30:01 crc kubenswrapper[4724]: I1002 13:30:01.074340 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22"] Oct 02 13:30:01 crc kubenswrapper[4724]: W1002 13:30:01.088457 4724 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62665e12_a18e_4b6c_b1e7_dda7ca1a7d36.slice/crio-e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31 WatchSource:0}: Error finding container e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31: Status 404 returned error can't find the container with id e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31 Oct 02 13:30:01 crc kubenswrapper[4724]: I1002 13:30:01.135710 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" event={"ID":"c02cf745-080b-4911-9986-21363b80638f","Type":"ContainerStarted","Data":"4fe7b5e00cfef1f269159495c2be39cc984612c324ed621a8d3609a8336af164"} Oct 02 13:30:01 crc kubenswrapper[4724]: I1002 13:30:01.137685 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" event={"ID":"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36","Type":"ContainerStarted","Data":"e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31"} Oct 02 13:30:01 crc 
kubenswrapper[4724]: I1002 13:30:01.139690 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" event={"ID":"db162d47-0532-4318-b29e-ac37f78a58dd","Type":"ContainerStarted","Data":"27dba60e659143f6bf285839d8b14a81555e5b2587e0d511b7de5027dd822b10"} Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.149989 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" event={"ID":"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36","Type":"ContainerStarted","Data":"094b22ad33e65dcb051143ca6a34a338fd5332a2af5a764e6ab47c045c9643ad"} Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.151774 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" event={"ID":"db162d47-0532-4318-b29e-ac37f78a58dd","Type":"ContainerStarted","Data":"be1db3fbfb1ab7c496cbabb27271fc85b2092c44fa2f6e8c3daa8202853b79b2"} Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.153514 4724 generic.go:334] "Generic (PLEG): container finished" podID="c02cf745-080b-4911-9986-21363b80638f" containerID="6f8af37aaa211ee3cf1009ec6221165bec65a053a561716f1726079aa7eb2951" exitCode=0 Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.153589 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" event={"ID":"c02cf745-080b-4911-9986-21363b80638f","Type":"ContainerDied","Data":"6f8af37aaa211ee3cf1009ec6221165bec65a053a561716f1726079aa7eb2951"} Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.170713 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" podStartSLOduration=2.170687194 podStartE2EDuration="2.170687194s" podCreationTimestamp="2025-10-02 13:30:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:30:02.165923451 +0000 UTC m=+1866.620682572" watchObservedRunningTime="2025-10-02 13:30:02.170687194 +0000 UTC m=+1866.625446315" Oct 02 13:30:02 crc kubenswrapper[4724]: I1002 13:30:02.212200 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" podStartSLOduration=2.212178487 podStartE2EDuration="2.212178487s" podCreationTimestamp="2025-10-02 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:30:02.206797728 +0000 UTC m=+1866.661556849" watchObservedRunningTime="2025-10-02 13:30:02.212178487 +0000 UTC m=+1866.666937608" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.163333 4724 generic.go:334] "Generic (PLEG): container finished" podID="62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" containerID="094b22ad33e65dcb051143ca6a34a338fd5332a2af5a764e6ab47c045c9643ad" exitCode=0 Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.163413 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" event={"ID":"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36","Type":"ContainerDied","Data":"094b22ad33e65dcb051143ca6a34a338fd5332a2af5a764e6ab47c045c9643ad"} Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.170765 4724 generic.go:334] "Generic (PLEG): container finished" podID="db162d47-0532-4318-b29e-ac37f78a58dd" containerID="be1db3fbfb1ab7c496cbabb27271fc85b2092c44fa2f6e8c3daa8202853b79b2" exitCode=0 Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.170974 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" 
event={"ID":"db162d47-0532-4318-b29e-ac37f78a58dd","Type":"ContainerDied","Data":"be1db3fbfb1ab7c496cbabb27271fc85b2092c44fa2f6e8c3daa8202853b79b2"} Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.465672 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.636980 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume\") pod \"c02cf745-080b-4911-9986-21363b80638f\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.637029 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume\") pod \"c02cf745-080b-4911-9986-21363b80638f\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.637177 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szzc6\" (UniqueName: \"kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6\") pod \"c02cf745-080b-4911-9986-21363b80638f\" (UID: \"c02cf745-080b-4911-9986-21363b80638f\") " Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.637433 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c02cf745-080b-4911-9986-21363b80638f" (UID: "c02cf745-080b-4911-9986-21363b80638f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.637662 4724 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02cf745-080b-4911-9986-21363b80638f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.643081 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c02cf745-080b-4911-9986-21363b80638f" (UID: "c02cf745-080b-4911-9986-21363b80638f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.644432 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6" (OuterVolumeSpecName: "kube-api-access-szzc6") pod "c02cf745-080b-4911-9986-21363b80638f" (UID: "c02cf745-080b-4911-9986-21363b80638f"). InnerVolumeSpecName "kube-api-access-szzc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.739374 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szzc6\" (UniqueName: \"kubernetes.io/projected/c02cf745-080b-4911-9986-21363b80638f-kube-api-access-szzc6\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4724]: I1002 13:30:03.739420 4724 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02cf745-080b-4911-9986-21363b80638f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.179824 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" event={"ID":"c02cf745-080b-4911-9986-21363b80638f","Type":"ContainerDied","Data":"4fe7b5e00cfef1f269159495c2be39cc984612c324ed621a8d3609a8336af164"} Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.179902 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe7b5e00cfef1f269159495c2be39cc984612c324ed621a8d3609a8336af164" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.179870 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-kzdnt" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.517457 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.524055 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.654306 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data\") pod \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.654676 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vddgq\" (UniqueName: \"kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq\") pod \"db162d47-0532-4318-b29e-ac37f78a58dd\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.654769 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnshl\" (UniqueName: \"kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl\") pod \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.654890 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data\") pod \"db162d47-0532-4318-b29e-ac37f78a58dd\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.655043 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\" (UID: \"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.655133 4724 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"db162d47-0532-4318-b29e-ac37f78a58dd\" (UID: \"db162d47-0532-4318-b29e-ac37f78a58dd\") " Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.659917 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq" (OuterVolumeSpecName: "kube-api-access-vddgq") pod "db162d47-0532-4318-b29e-ac37f78a58dd" (UID: "db162d47-0532-4318-b29e-ac37f78a58dd"). InnerVolumeSpecName "kube-api-access-vddgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.660192 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "db162d47-0532-4318-b29e-ac37f78a58dd" (UID: "db162d47-0532-4318-b29e-ac37f78a58dd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.660211 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" (UID: "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.660287 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "db162d47-0532-4318-b29e-ac37f78a58dd" (UID: "db162d47-0532-4318-b29e-ac37f78a58dd"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.670902 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl" (OuterVolumeSpecName: "kube-api-access-gnshl") pod "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" (UID: "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36"). InnerVolumeSpecName "kube-api-access-gnshl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.672806 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" (UID: "62665e12-a18e-4b6c-b1e7-dda7ca1a7d36"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.757265 4724 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.757301 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vddgq\" (UniqueName: \"kubernetes.io/projected/db162d47-0532-4318-b29e-ac37f78a58dd-kube-api-access-vddgq\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.757310 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnshl\" (UniqueName: \"kubernetes.io/projected/62665e12-a18e-4b6c-b1e7-dda7ca1a7d36-kube-api-access-gnshl\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:04 crc kubenswrapper[4724]: I1002 13:30:04.757318 4724 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: 
\"kubernetes.io/secret/db162d47-0532-4318-b29e-ac37f78a58dd-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.196869 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.197000 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-29323538brpl" event={"ID":"db162d47-0532-4318-b29e-ac37f78a58dd","Type":"ContainerDied","Data":"27dba60e659143f6bf285839d8b14a81555e5b2587e0d511b7de5027dd822b10"} Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.197054 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dba60e659143f6bf285839d8b14a81555e5b2587e0d511b7de5027dd822b10" Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.202231 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" event={"ID":"62665e12-a18e-4b6c-b1e7-dda7ca1a7d36","Type":"ContainerDied","Data":"e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31"} Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.202286 4724 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8463b93819987b33134c152f7c02a6c3d739f055c2f7e8c323654866bdd6b31" Oct 02 13:30:05 crc kubenswrapper[4724]: I1002 13:30:05.202382 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2932353b7j22" Oct 02 13:30:15 crc kubenswrapper[4724]: I1002 13:30:15.313415 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:30:15 crc kubenswrapper[4724]: E1002 13:30:15.315934 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:30:26 crc kubenswrapper[4724]: I1002 13:30:26.319301 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:30:26 crc kubenswrapper[4724]: E1002 13:30:26.320437 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:30:39 crc kubenswrapper[4724]: I1002 13:30:39.313793 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:30:39 crc kubenswrapper[4724]: E1002 13:30:39.314618 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:30:42 crc kubenswrapper[4724]: I1002 13:30:42.518722 4724 generic.go:334] "Generic (PLEG): container finished" podID="e22842c3-d802-495a-9576-a187044f44d8" containerID="fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4" exitCode=0 Oct 02 13:30:42 crc kubenswrapper[4724]: I1002 13:30:42.518810 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-slsj8/must-gather-pqpxq" event={"ID":"e22842c3-d802-495a-9576-a187044f44d8","Type":"ContainerDied","Data":"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4"} Oct 02 13:30:42 crc kubenswrapper[4724]: I1002 13:30:42.519835 4724 scope.go:117] "RemoveContainer" containerID="fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4" Oct 02 13:30:43 crc kubenswrapper[4724]: I1002 13:30:43.173850 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-slsj8_must-gather-pqpxq_e22842c3-d802-495a-9576-a187044f44d8/gather/0.log" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.463427 4724 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5rbz"] Oct 02 13:30:46 crc kubenswrapper[4724]: E1002 13:30:46.464694 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db162d47-0532-4318-b29e-ac37f78a58dd" containerName="glance-cache-glance-default-external-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464710 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="db162d47-0532-4318-b29e-ac37f78a58dd" containerName="glance-cache-glance-default-external-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: E1002 13:30:46.464730 4724 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" containerName="glance-cache-glance-default-internal-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464737 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" containerName="glance-cache-glance-default-internal-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: E1002 13:30:46.464750 4724 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c02cf745-080b-4911-9986-21363b80638f" containerName="collect-profiles" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464756 4724 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02cf745-080b-4911-9986-21363b80638f" containerName="collect-profiles" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464882 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="62665e12-a18e-4b6c-b1e7-dda7ca1a7d36" containerName="glance-cache-glance-default-internal-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464902 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="c02cf745-080b-4911-9986-21363b80638f" containerName="collect-profiles" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.464918 4724 memory_manager.go:354] "RemoveStaleState removing state" podUID="db162d47-0532-4318-b29e-ac37f78a58dd" containerName="glance-cache-glance-default-external-api-0-cleaner" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.466038 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.479517 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5rbz"] Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.621594 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-utilities\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.621653 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fgq\" (UniqueName: \"kubernetes.io/projected/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-kube-api-access-b6fgq\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.621777 4724 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-catalog-content\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.723905 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-utilities\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.723968 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b6fgq\" (UniqueName: \"kubernetes.io/projected/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-kube-api-access-b6fgq\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.724041 4724 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-catalog-content\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.724681 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-catalog-content\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.724875 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-utilities\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.752733 4724 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fgq\" (UniqueName: \"kubernetes.io/projected/52c5bfd8-dac5-4e7d-bb4d-794353aaba35-kube-api-access-b6fgq\") pod \"redhat-operators-t5rbz\" (UID: \"52c5bfd8-dac5-4e7d-bb4d-794353aaba35\") " pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:46 crc kubenswrapper[4724]: I1002 13:30:46.788355 4724 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:30:47 crc kubenswrapper[4724]: I1002 13:30:47.223571 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5rbz"] Oct 02 13:30:47 crc kubenswrapper[4724]: I1002 13:30:47.553143 4724 generic.go:334] "Generic (PLEG): container finished" podID="52c5bfd8-dac5-4e7d-bb4d-794353aaba35" containerID="3b6fb92eddb52a8fb54ee5b5a958cac473248fce8f1066c8371a42366c3cb883" exitCode=0 Oct 02 13:30:47 crc kubenswrapper[4724]: I1002 13:30:47.553203 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rbz" event={"ID":"52c5bfd8-dac5-4e7d-bb4d-794353aaba35","Type":"ContainerDied","Data":"3b6fb92eddb52a8fb54ee5b5a958cac473248fce8f1066c8371a42366c3cb883"} Oct 02 13:30:47 crc kubenswrapper[4724]: I1002 13:30:47.553367 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rbz" event={"ID":"52c5bfd8-dac5-4e7d-bb4d-794353aaba35","Type":"ContainerStarted","Data":"255aecfec454ebc79c35839e26158aaf6f58fa274978d14cae4b24f399101401"} Oct 02 13:30:47 crc kubenswrapper[4724]: I1002 13:30:47.555184 4724 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:30:50 crc kubenswrapper[4724]: I1002 13:30:50.473740 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-slsj8/must-gather-pqpxq"] Oct 02 13:30:50 crc kubenswrapper[4724]: I1002 13:30:50.475163 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-slsj8/must-gather-pqpxq" podUID="e22842c3-d802-495a-9576-a187044f44d8" containerName="copy" containerID="cri-o://45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b" gracePeriod=2 Oct 02 13:30:50 crc kubenswrapper[4724]: I1002 13:30:50.480008 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-slsj8/must-gather-pqpxq"] Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.003615 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-slsj8_must-gather-pqpxq_e22842c3-d802-495a-9576-a187044f44d8/copy/0.log" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.005707 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.105959 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx4x2\" (UniqueName: \"kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2\") pod \"e22842c3-d802-495a-9576-a187044f44d8\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.106452 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output\") pod \"e22842c3-d802-495a-9576-a187044f44d8\" (UID: \"e22842c3-d802-495a-9576-a187044f44d8\") " Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.120946 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2" (OuterVolumeSpecName: "kube-api-access-mx4x2") pod "e22842c3-d802-495a-9576-a187044f44d8" (UID: "e22842c3-d802-495a-9576-a187044f44d8"). InnerVolumeSpecName "kube-api-access-mx4x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.210655 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx4x2\" (UniqueName: \"kubernetes.io/projected/e22842c3-d802-495a-9576-a187044f44d8-kube-api-access-mx4x2\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.241296 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e22842c3-d802-495a-9576-a187044f44d8" (UID: "e22842c3-d802-495a-9576-a187044f44d8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.312065 4724 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e22842c3-d802-495a-9576-a187044f44d8-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.590674 4724 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-slsj8_must-gather-pqpxq_e22842c3-d802-495a-9576-a187044f44d8/copy/0.log" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.592157 4724 generic.go:334] "Generic (PLEG): container finished" podID="e22842c3-d802-495a-9576-a187044f44d8" containerID="45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b" exitCode=143 Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.592243 4724 scope.go:117] "RemoveContainer" containerID="45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.592263 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-slsj8/must-gather-pqpxq" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.616305 4724 scope.go:117] "RemoveContainer" containerID="fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.690727 4724 scope.go:117] "RemoveContainer" containerID="45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b" Oct 02 13:30:51 crc kubenswrapper[4724]: E1002 13:30:51.692107 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b\": container with ID starting with 45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b not found: ID does not exist" containerID="45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.692162 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b"} err="failed to get container status \"45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b\": rpc error: code = NotFound desc = could not find container \"45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b\": container with ID starting with 45479da7a027aff4dd548c4932ea02fb96c3203f4582eed4d0e33646dd56813b not found: ID does not exist" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.692199 4724 scope.go:117] "RemoveContainer" containerID="fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4" Oct 02 13:30:51 crc kubenswrapper[4724]: E1002 13:30:51.693896 4724 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4\": container with ID starting with 
fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4 not found: ID does not exist" containerID="fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4" Oct 02 13:30:51 crc kubenswrapper[4724]: I1002 13:30:51.693955 4724 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4"} err="failed to get container status \"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4\": rpc error: code = NotFound desc = could not find container \"fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4\": container with ID starting with fee9adebbd8e65566a16fa5a2e5a81aaa81ec9cd14e42657a02c6eeccafe89f4 not found: ID does not exist" Oct 02 13:30:52 crc kubenswrapper[4724]: I1002 13:30:52.336968 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22842c3-d802-495a-9576-a187044f44d8" path="/var/lib/kubelet/pods/e22842c3-d802-495a-9576-a187044f44d8/volumes" Oct 02 13:30:54 crc kubenswrapper[4724]: I1002 13:30:54.313786 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:30:54 crc kubenswrapper[4724]: E1002 13:30:54.314359 4724 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74k4t_openshift-machine-config-operator(f6090eaa-c182-4788-950c-16352c271233)\"" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" podUID="f6090eaa-c182-4788-950c-16352c271233" Oct 02 13:30:56 crc kubenswrapper[4724]: I1002 13:30:56.654864 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rbz" 
event={"ID":"52c5bfd8-dac5-4e7d-bb4d-794353aaba35","Type":"ContainerStarted","Data":"1dcb8c6ea5b0ed67d3310230309795a481e514f4a9072248ecdff2cebe424a33"} Oct 02 13:30:57 crc kubenswrapper[4724]: I1002 13:30:57.683009 4724 generic.go:334] "Generic (PLEG): container finished" podID="52c5bfd8-dac5-4e7d-bb4d-794353aaba35" containerID="1dcb8c6ea5b0ed67d3310230309795a481e514f4a9072248ecdff2cebe424a33" exitCode=0 Oct 02 13:30:57 crc kubenswrapper[4724]: I1002 13:30:57.683065 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rbz" event={"ID":"52c5bfd8-dac5-4e7d-bb4d-794353aaba35","Type":"ContainerDied","Data":"1dcb8c6ea5b0ed67d3310230309795a481e514f4a9072248ecdff2cebe424a33"} Oct 02 13:30:58 crc kubenswrapper[4724]: I1002 13:30:58.694972 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rbz" event={"ID":"52c5bfd8-dac5-4e7d-bb4d-794353aaba35","Type":"ContainerStarted","Data":"7a6e9e39469c9105b90f6b53102621ab5da67a9555711881b522a5d39dc31e36"} Oct 02 13:30:58 crc kubenswrapper[4724]: I1002 13:30:58.715610 4724 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t5rbz" podStartSLOduration=2.091046676 podStartE2EDuration="12.715582559s" podCreationTimestamp="2025-10-02 13:30:46 +0000 UTC" firstStartedPulling="2025-10-02 13:30:47.554909936 +0000 UTC m=+1912.009669057" lastFinishedPulling="2025-10-02 13:30:58.179445819 +0000 UTC m=+1922.634204940" observedRunningTime="2025-10-02 13:30:58.713325641 +0000 UTC m=+1923.168084762" watchObservedRunningTime="2025-10-02 13:30:58.715582559 +0000 UTC m=+1923.170341680" Oct 02 13:31:06 crc kubenswrapper[4724]: I1002 13:31:06.789227 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:31:06 crc kubenswrapper[4724]: I1002 13:31:06.789796 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:31:06 crc kubenswrapper[4724]: I1002 13:31:06.838891 4724 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:31:07 crc kubenswrapper[4724]: I1002 13:31:07.809179 4724 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5rbz" Oct 02 13:31:07 crc kubenswrapper[4724]: I1002 13:31:07.892638 4724 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5rbz"] Oct 02 13:31:07 crc kubenswrapper[4724]: I1002 13:31:07.937715 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:31:07 crc kubenswrapper[4724]: I1002 13:31:07.938018 4724 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qvzc9" podUID="8ce22671-98d5-4e0e-9851-7da087e63499" containerName="registry-server" containerID="cri-o://d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" gracePeriod=2 Oct 02 13:31:08 crc kubenswrapper[4724]: I1002 13:31:08.764211 4724 generic.go:334] "Generic (PLEG): container finished" podID="8ce22671-98d5-4e0e-9851-7da087e63499" containerID="d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" exitCode=0 Oct 02 13:31:08 crc kubenswrapper[4724]: I1002 13:31:08.765188 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerDied","Data":"d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6"} Oct 02 13:31:09 crc kubenswrapper[4724]: E1002 13:31:09.051674 4724 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6 is running failed: container process not found" containerID="d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 13:31:09 crc kubenswrapper[4724]: E1002 13:31:09.052012 4724 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6 is running failed: container process not found" containerID="d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 13:31:09 crc kubenswrapper[4724]: E1002 13:31:09.052228 4724 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6 is running failed: container process not found" containerID="d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" cmd=["grpc_health_probe","-addr=:50051"] Oct 02 13:31:09 crc kubenswrapper[4724]: E1002 13:31:09.052262 4724 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qvzc9" podUID="8ce22671-98d5-4e0e-9851-7da087e63499" containerName="registry-server" Oct 02 13:31:09 crc kubenswrapper[4724]: I1002 13:31:09.313257 4724 scope.go:117] "RemoveContainer" containerID="5cb607848e1db258b2539f214e6e2a3e1c03875bea1d097446a579c07b4f8d5a" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.213288 4724 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.359364 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content\") pod \"8ce22671-98d5-4e0e-9851-7da087e63499\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.359500 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zs2r\" (UniqueName: \"kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r\") pod \"8ce22671-98d5-4e0e-9851-7da087e63499\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.359643 4724 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities\") pod \"8ce22671-98d5-4e0e-9851-7da087e63499\" (UID: \"8ce22671-98d5-4e0e-9851-7da087e63499\") " Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.360240 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities" (OuterVolumeSpecName: "utilities") pod "8ce22671-98d5-4e0e-9851-7da087e63499" (UID: "8ce22671-98d5-4e0e-9851-7da087e63499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.378779 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r" (OuterVolumeSpecName: "kube-api-access-8zs2r") pod "8ce22671-98d5-4e0e-9851-7da087e63499" (UID: "8ce22671-98d5-4e0e-9851-7da087e63499"). InnerVolumeSpecName "kube-api-access-8zs2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.439316 4724 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ce22671-98d5-4e0e-9851-7da087e63499" (UID: "8ce22671-98d5-4e0e-9851-7da087e63499"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.461626 4724 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zs2r\" (UniqueName: \"kubernetes.io/projected/8ce22671-98d5-4e0e-9851-7da087e63499-kube-api-access-8zs2r\") on node \"crc\" DevicePath \"\"" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.461660 4724 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.461672 4724 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce22671-98d5-4e0e-9851-7da087e63499-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.790680 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74k4t" event={"ID":"f6090eaa-c182-4788-950c-16352c271233","Type":"ContainerStarted","Data":"7e12baf62c8ba176877fe68b84d64798e7783e77514391c62cb74f38f38821b7"} Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.793749 4724 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvzc9" event={"ID":"8ce22671-98d5-4e0e-9851-7da087e63499","Type":"ContainerDied","Data":"e1bed5694c3423aac21b2bb5781b4804f24d22f170b53ba271787bd10dcaddd5"} Oct 02 13:31:12 crc 
kubenswrapper[4724]: I1002 13:31:12.793796 4724 scope.go:117] "RemoveContainer" containerID="d8775b14cae3f819cd3243ba5593afd2595fe02ebc12a7e0e1ead85887fad3e6" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.793834 4724 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qvzc9" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.816096 4724 scope.go:117] "RemoveContainer" containerID="d2e75e274098d8b2aa30ca240780d309e6b590ae6a371187a68004c1ccac5305" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.844904 4724 scope.go:117] "RemoveContainer" containerID="8738163ca81747b4acc611e282a4daa506428591d0094f1fee997e649997f0ab" Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.853993 4724 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:31:12 crc kubenswrapper[4724]: I1002 13:31:12.867866 4724 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qvzc9"] Oct 02 13:31:14 crc kubenswrapper[4724]: I1002 13:31:14.322073 4724 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce22671-98d5-4e0e-9851-7da087e63499" path="/var/lib/kubelet/pods/8ce22671-98d5-4e0e-9851-7da087e63499/volumes"